2026-04-02 00:20:41.762 15367 INFO oslo_service.periodic_task [-] Skipping periodic task _discover_hosts_in_cells because its interval is negative
2026-04-02 00:20:41.921 15367 CRITICAL nova [req-492e3ee9-adc3-4543-af8d-658374ad932c - - - - -] Unhandled error: oslo_db.exception.DBNonExistentTable: (sqlite3.OperationalError) no such table: cell_mappings [SQL: SELECT cell_mappings.created_at AS cell_mappings_created_at, cell_mappings.updated_at AS cell_mappings_updated_at, cell_mappings.id AS cell_mappings_id, cell_mappings.uuid AS cell_mappings_uuid, cell_mappings.name AS cell_mappings_name, cell_mappings.transport_url AS cell_mappings_transport_url, cell_mappings.database_connection AS cell_mappings_database_connection, cell_mappings.disabled AS cell_mappings_disabled FROM cell_mappings ORDER BY cell_mappings.id ASC] (Background on this error at: https://sqlalche.me/e/14/e3q8)
2026-04-02 00:20:41.921 15367 ERROR nova Traceback (most recent call last):
2026-04-02 00:20:41.921 15367 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/engine/base.py", line 1802, in _execute_context
2026-04-02 00:20:41.921 15367 ERROR nova     self.dialect.do_execute(
2026-04-02 00:20:41.921 15367 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/engine/default.py", line 732, in do_execute
2026-04-02 00:20:41.921 15367 ERROR nova     cursor.execute(statement, parameters)
2026-04-02 00:20:41.921 15367 ERROR nova sqlite3.OperationalError: no such table: cell_mappings
2026-04-02 00:20:41.921 15367 ERROR nova 
2026-04-02 00:20:41.921 15367 ERROR nova The above exception was the direct cause of the following exception:
2026-04-02 00:20:41.921 15367 ERROR nova 
2026-04-02 00:20:41.921 15367 ERROR nova Traceback (most recent call last):
2026-04-02 00:20:41.921 15367 ERROR nova   File "/usr/bin/nova-scheduler", line 10, in <module>
2026-04-02 00:20:41.921 15367 ERROR nova     sys.exit(main())
2026-04-02 00:20:41.921 15367 ERROR nova   File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main
2026-04-02 00:20:41.921 15367 ERROR nova     server = service.Service.create(
2026-04-02 00:20:41.921 15367 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create
2026-04-02 00:20:41.921 15367 ERROR nova     service_obj = cls(host, binary, topic, manager,
2026-04-02 00:20:41.921 15367 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__
2026-04-02 00:20:41.921 15367 ERROR nova     self.manager = manager_class(host=self.host, *args, **kwargs)
2026-04-02 00:20:41.921 15367 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 66, in __init__
2026-04-02 00:20:41.921 15367 ERROR nova     self.host_manager = host_manager.HostManager()
2026-04-02 00:20:41.921 15367 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/host_manager.py", line 334, in __init__
2026-04-02 00:20:41.921 15367 ERROR nova     self.refresh_cells_caches()
2026-04-02 00:20:41.921 15367 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/host_manager.py", line 718, in refresh_cells_caches
2026-04-02 00:20:41.921 15367 ERROR nova     temp_cells = objects.CellMappingList.get_all(context)
2026-04-02 00:20:41.921 15367 ERROR nova   File "/usr/lib/python3/dist-packages/oslo_versionedobjects/base.py", line 184, in wrapper
2026-04-02 00:20:41.921 15367 ERROR nova     result = fn(cls, context, *args, **kwargs)
2026-04-02 00:20:41.921 15367 ERROR nova   File "/usr/lib/python3/dist-packages/nova/objects/cell_mapping.py", line 256, in get_all
2026-04-02 00:20:41.921 15367 ERROR nova     db_mappings = cls._get_all_from_db(context)
2026-04-02 00:20:41.921 15367 ERROR nova   File "/usr/lib/python3/dist-packages/oslo_db/sqlalchemy/enginefacade.py", line 1010, in wrapper
2026-04-02 00:20:41.921 15367 ERROR nova     return fn(*args, **kwargs)
2026-04-02 00:20:41.921 15367 ERROR nova   File "/usr/lib/python3/dist-packages/nova/objects/cell_mapping.py", line 252, in _get_all_from_db
2026-04-02 00:20:41.921 15367 ERROR nova     expression.asc(api_db_models.CellMapping.id)).all()
2026-04-02 00:20:41.921 15367 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/orm/query.py", line 2759, in all
2026-04-02 00:20:41.921 15367 ERROR nova     return self._iter().all()
2026-04-02 00:20:41.921 15367 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/orm/query.py", line 2894, in _iter
2026-04-02 00:20:41.921 15367 ERROR nova     result = self.session.execute(
2026-04-02 00:20:41.921 15367 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/orm/session.py", line 1692, in execute
2026-04-02 00:20:41.921 15367 ERROR nova     result = conn._execute_20(statement, params or {}, execution_options)
2026-04-02 00:20:41.921 15367 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/engine/base.py", line 1614, in _execute_20
2026-04-02 00:20:41.921 15367 ERROR nova     return meth(self, args_10style, kwargs_10style, execution_options)
2026-04-02 00:20:41.921 15367 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/sql/elements.py", line 325, in _execute_on_connection
2026-04-02 00:20:41.921 15367 ERROR nova     return connection._execute_clauseelement(
2026-04-02 00:20:41.921 15367 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/engine/base.py", line 1481, in _execute_clauseelement
2026-04-02 00:20:41.921 15367 ERROR nova     ret = self._execute_context(
2026-04-02 00:20:41.921 15367 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/engine/base.py", line 1845, in _execute_context
2026-04-02 00:20:41.921 15367 ERROR nova     self._handle_dbapi_exception(
2026-04-02 00:20:41.921 15367 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/engine/base.py", line 2024, in _handle_dbapi_exception
2026-04-02 00:20:41.921 15367 ERROR nova     util.raise_(newraise, with_traceback=exc_info[2], from_=e)
2026-04-02 00:20:41.921 15367 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/util/compat.py", line 207, in raise_
2026-04-02 00:20:41.921 15367 ERROR nova     raise exception
2026-04-02 00:20:41.921 15367 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/engine/base.py", line 1802, in _execute_context
2026-04-02 00:20:41.921 15367 ERROR nova     self.dialect.do_execute(
2026-04-02 00:20:41.921 15367 ERROR nova   File "/usr/lib/python3/dist-packages/sqlalchemy/engine/default.py", line 732, in do_execute
2026-04-02 00:20:41.921 15367 ERROR nova     cursor.execute(statement, parameters)
2026-04-02 00:20:41.921 15367 ERROR nova oslo_db.exception.DBNonExistentTable: (sqlite3.OperationalError) no such table: cell_mappings
2026-04-02 00:20:41.921 15367 ERROR nova [SQL: SELECT cell_mappings.created_at AS cell_mappings_created_at, cell_mappings.updated_at AS cell_mappings_updated_at, cell_mappings.id AS cell_mappings_id, cell_mappings.uuid AS cell_mappings_uuid, cell_mappings.name AS cell_mappings_name, cell_mappings.transport_url AS cell_mappings_transport_url, cell_mappings.database_connection AS cell_mappings_database_connection, cell_mappings.disabled AS cell_mappings_disabled
2026-04-02 00:20:41.921 15367 ERROR nova FROM cell_mappings ORDER BY cell_mappings.id ASC]
2026-04-02 00:20:41.921 15367 ERROR nova (Background on this error at: https://sqlalche.me/e/14/e3q8)
2026-04-02 00:20:41.921 15367 ERROR nova 
2026-04-02 00:20:48.176 15434 INFO oslo_service.periodic_task [-] Skipping periodic task _discover_hosts_in_cells because its interval is negative
2026-04-02 00:20:48.354 15434 CRITICAL nova [req-b83c85f6-98cf-4a59-90c4-012e90dfa978 - - - - -] Unhandled error: oslo_db.exception.DBNonExistentTable: (sqlite3.OperationalError) no such table: cell_mappings [SQL: SELECT cell_mappings.created_at AS cell_mappings_created_at, cell_mappings.updated_at AS cell_mappings_updated_at, cell_mappings.id AS cell_mappings_id, cell_mappings.uuid AS cell_mappings_uuid, cell_mappings.name AS cell_mappings_name, cell_mappings.transport_url AS cell_mappings_transport_url, cell_mappings.database_connection AS cell_mappings_database_connection, cell_mappings.disabled AS cell_mappings_disabled FROM cell_mappings ORDER BY cell_mappings.id ASC] (Background on this error at: https://sqlalche.me/e/14/e3q8)
2026-04-02 00:20:59.746 15550 INFO oslo_service.periodic_task [-] Skipping periodic task _discover_hosts_in_cells because its interval is negative
2026-04-02 00:20:59.935 15550 CRITICAL nova [req-2a7f03db-7dff-4cf1-b8b3-290b14041e91 - - - - -] Unhandled error: oslo_db.exception.DBNonExistentTable: (sqlite3.OperationalError) no such table: cell_mappings [SQL: SELECT cell_mappings.created_at AS cell_mappings_created_at, cell_mappings.updated_at AS cell_mappings_updated_at, cell_mappings.id AS cell_mappings_id, cell_mappings.uuid AS cell_mappings_uuid, cell_mappings.name AS cell_mappings_name, cell_mappings.transport_url AS cell_mappings_transport_url, cell_mappings.database_connection AS cell_mappings_database_connection, cell_mappings.disabled AS cell_mappings_disabled FROM cell_mappings ORDER BY cell_mappings.id ASC] (Background on this error at: https://sqlalche.me/e/14/e3q8)
2026-04-02 00:35:47.552 110810 DEBUG oslo_db.sqlalchemy.engines [req-380cf801-c873-44c4-8692-a4501f8e5f1e - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-02 00:35:47.587 110810 DEBUG nova.scheduler.host_manager [req-380cf801-c873-44c4-8692-a4501f8e5f1e - - - - -] Found 1 cells: 93adf21c-64ef-44ce-b3d6-e2add9f3b92e refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730
2026-04-02 00:35:47.588 110810 DEBUG nova.scheduler.host_manager [req-380cf801-c873-44c4-8692-a4501f8e5f1e - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742
2026-04-02 00:35:47.592 110810 WARNING nova.scheduler.filters.availability_zone_filter [req-380cf801-c873-44c4-8692-a4501f8e5f1e - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement.
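The three `no such table: cell_mappings` crashes above, followed by the later `Found 1 cells` line once the scheduler is talking to MySQL, are consistent with a nova API database that had not yet been schema-synced (the `sqlite3` driver in the early tracebacks suggests nova.conf was still pointing at the packaged SQLite default). A minimal remediation sketch, assuming the standard `nova-manage` workflow and that `[api_database]/connection` and `[database]/connection` have already been pointed at the real database:

```shell
# Sketch only: assumes a reachable API/cell database configured in
# nova.conf; run as a user with access to nova.conf (e.g. via sudo -u nova).

# Create the API database schema (the cell_mappings table lives here).
nova-manage api_db sync

# Register cell0 and the first cell, then sync the cell database schema.
nova-manage cell_v2 map_cell0
nova-manage cell_v2 create_cell --name cell1 --verbose
nova-manage db sync

# Confirm cell mappings exist before restarting the scheduler.
nova-manage cell_v2 list_cells
systemctl restart nova-scheduler
```

After this, `refresh_cells_caches` should report the cell UUID instead of raising `DBNonExistentTable`, as the later `Found 1 cells` DEBUG line shows.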
2026-04-02 00:35:47.676 110810 ERROR nova.scheduler.client.report [req-380cf801-c873-44c4-8692-a4501f8e5f1e - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-02 00:35:47.676 110810 CRITICAL nova [req-380cf801-c873-44c4-8692-a4501f8e5f1e - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-02 00:35:47.676 110810 ERROR nova Traceback (most recent call last):
2026-04-02 00:35:47.676 110810 ERROR nova   File "/usr/bin/nova-scheduler", line 10, in <module>
2026-04-02 00:35:47.676 110810 ERROR nova     sys.exit(main())
2026-04-02 00:35:47.676 110810 ERROR nova   File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main
2026-04-02 00:35:47.676 110810 ERROR nova     server = service.Service.create(
2026-04-02 00:35:47.676 110810 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create
2026-04-02 00:35:47.676 110810 ERROR nova     service_obj = cls(host, binary, topic, manager,
2026-04-02 00:35:47.676 110810 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__
2026-04-02 00:35:47.676 110810 ERROR nova     self.manager = manager_class(host=self.host, *args, **kwargs)
2026-04-02 00:35:47.676 110810 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__
2026-04-02 00:35:47.676 110810 ERROR nova     self.placement_client = report.report_client_singleton()
2026-04-02 00:35:47.676 110810 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton
2026-04-02 00:35:47.676 110810 ERROR nova     PLACEMENTCLIENT = SchedulerReportClient()
2026-04-02 00:35:47.676 110810 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__
2026-04-02 00:35:47.676 110810 ERROR nova     self._client = self._create_client()
2026-04-02 00:35:47.676 110810 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client
2026-04-02 00:35:47.676 110810 ERROR nova     client = self._adapter or utils.get_sdk_adapter('placement')
2026-04-02 00:35:47.676 110810 ERROR nova   File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter
2026-04-02 00:35:47.676 110810 ERROR nova     return getattr(conn, service_type)
2026-04-02 00:35:47.676 110810 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__
2026-04-02 00:35:47.676 110810 ERROR nova     proxy = self._make_proxy(instance)
2026-04-02 00:35:47.676 110810 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy
2026-04-02 00:35:47.676 110810 ERROR nova     found_version = temp_adapter.get_api_major_version()
2026-04-02 00:35:47.676 110810 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version
2026-04-02 00:35:47.676 110810 ERROR nova     return self.session.get_api_major_version(auth or self.auth, **kwargs)
2026-04-02 00:35:47.676 110810 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version
2026-04-02 00:35:47.676 110810 ERROR nova     auth = self._auth_required(auth, 'determine endpoint URL')
2026-04-02 00:35:47.676 110810 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required
2026-04-02 00:35:47.676 110810 ERROR nova     raise exceptions.MissingAuthPlugin(msg_fmt % msg)
2026-04-02 00:35:47.676 110810 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-02 00:35:47.676 110810 ERROR nova 
2026-04-02 00:35:49.894 111094 DEBUG oslo_db.sqlalchemy.engines [req-24344b91-d6d5-499e-9755-8251ea94d61c - - - - -] MySQL server mode set to
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-02 00:35:49.927 111094 DEBUG nova.scheduler.host_manager [req-24344b91-d6d5-499e-9755-8251ea94d61c - - - - -] Found 1 cells: 93adf21c-64ef-44ce-b3d6-e2add9f3b92e refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-02 00:35:49.927 111094 DEBUG nova.scheduler.host_manager [req-24344b91-d6d5-499e-9755-8251ea94d61c - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-02 00:35:49.931 111094 WARNING nova.scheduler.filters.availability_zone_filter [req-24344b91-d6d5-499e-9755-8251ea94d61c - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-02 00:35:50.021 111094 ERROR nova.scheduler.client.report [req-24344b91-d6d5-499e-9755-8251ea94d61c - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-02 00:35:50.021 111094 CRITICAL nova [req-24344b91-d6d5-499e-9755-8251ea94d61c - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-02 00:35:50.021 111094 ERROR nova Traceback (most recent call last):
2026-04-02 00:35:50.021 111094 ERROR nova   File "/usr/bin/nova-scheduler", line 10, in <module>
2026-04-02 00:35:50.021 111094 ERROR nova     sys.exit(main())
2026-04-02 00:35:50.021 111094 ERROR nova   File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main
2026-04-02 00:35:50.021 111094 ERROR nova     server = service.Service.create(
2026-04-02 00:35:50.021 111094 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create
2026-04-02 00:35:50.021 111094 ERROR nova     service_obj = cls(host, binary, topic, manager,
2026-04-02 00:35:50.021 111094 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__
2026-04-02 00:35:50.021 111094 ERROR nova     self.manager = manager_class(host=self.host, *args, **kwargs)
2026-04-02 00:35:50.021 111094 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__
2026-04-02 00:35:50.021 111094 ERROR nova     self.placement_client = report.report_client_singleton()
2026-04-02 00:35:50.021 111094 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton
2026-04-02 00:35:50.021 111094 ERROR nova     PLACEMENTCLIENT = SchedulerReportClient()
2026-04-02 00:35:50.021 111094 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__
2026-04-02 00:35:50.021 111094 ERROR nova     self._client = self._create_client()
2026-04-02 00:35:50.021 111094 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client
2026-04-02 00:35:50.021 111094 ERROR nova     client = self._adapter or utils.get_sdk_adapter('placement')
2026-04-02 00:35:50.021 111094 ERROR nova   File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter
2026-04-02 00:35:50.021 111094 ERROR nova     return getattr(conn, service_type)
2026-04-02 00:35:50.021 111094 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__
2026-04-02 00:35:50.021 111094 ERROR nova     proxy = self._make_proxy(instance)
2026-04-02 00:35:50.021 111094 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy
2026-04-02 00:35:50.021 111094 ERROR nova     found_version = temp_adapter.get_api_major_version()
2026-04-02 00:35:50.021 111094 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version
2026-04-02 00:35:50.021 111094 ERROR nova     return self.session.get_api_major_version(auth or self.auth, **kwargs)
2026-04-02 00:35:50.021 111094 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version
2026-04-02 00:35:50.021 111094 ERROR nova     auth = self._auth_required(auth, 'determine endpoint URL')
2026-04-02 00:35:50.021 111094 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required
2026-04-02 00:35:50.021 111094 ERROR nova     raise exceptions.MissingAuthPlugin(msg_fmt % msg)
2026-04-02 00:35:50.021 111094 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-02 00:35:50.021 111094 ERROR nova
2026-04-02 00:35:52.194 111106 DEBUG oslo_db.sqlalchemy.engines [req-6b7c046d-e704-4c17-a72c-2330051f12b6 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-02 00:35:52.242 111106 DEBUG nova.scheduler.host_manager [req-6b7c046d-e704-4c17-a72c-2330051f12b6 - - - - -] Found 1 cells: 93adf21c-64ef-44ce-b3d6-e2add9f3b92e refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730
2026-04-02 00:35:52.242 111106 DEBUG nova.scheduler.host_manager [req-6b7c046d-e704-4c17-a72c-2330051f12b6 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742
2026-04-02 00:35:52.246 111106 WARNING nova.scheduler.filters.availability_zone_filter [req-6b7c046d-e704-4c17-a72c-2330051f12b6 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement.
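Every restart hits the same failure: keystoneauth1 cannot build an auth plugin for the placement service, so the scheduler crashes before it can determine the placement endpoint URL. This usually means nova.conf is missing, or has a broken, `[placement]` section. A minimal sketch of that section, assuming standard Keystone password auth; the `auth_url`, region, and password below are placeholders, not values taken from this log:

```ini
# nova.conf -- credentials the scheduler uses to reach the placement API.
# Placeholder values; substitute your controller hostname, region, and
# the real password of the 'placement' service user.
[placement]
auth_type = password
auth_url = http://controller:5000/v3
project_domain_name = Default
user_domain_name = Default
project_name = service
username = placement
password = PLACEMENT_PASS
region_name = RegionOne
```

After fixing the section, restart nova-scheduler and confirm the traceback no longer recurs. Separately, the deprecation warning above can be addressed by dropping `AvailabilityZoneFilter` from `enabled_filters` in `[filter_scheduler]`, since placement-based AZ enforcement (`query_placement_for_availability_zone`) is enabled by default.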
[... the same MissingAuthPlugin traceback and DEBUG/WARNING preamble repeat verbatim on each subsequent restart attempt (00:35:52, 00:35:55, 00:35:57, 00:35:59, 00:36:02, 00:36:04, 00:36:07); only the timestamps, PIDs, and request IDs differ ...]
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-02 00:36:09.361 113313 DEBUG nova.scheduler.host_manager [req-1091af45-6755-4939-bd14-f4c7bdceea15 - - - - -] Found 1 cells: 93adf21c-64ef-44ce-b3d6-e2add9f3b92e refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-02 00:36:09.362 113313 DEBUG nova.scheduler.host_manager [req-1091af45-6755-4939-bd14-f4c7bdceea15 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-02 00:36:09.367 113313 WARNING nova.scheduler.filters.availability_zone_filter [req-1091af45-6755-4939-bd14-f4c7bdceea15 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-02 00:36:09.464 113313 ERROR nova.scheduler.client.report [req-1091af45-6755-4939-bd14-f4c7bdceea15 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:09.465 113313 CRITICAL nova [req-1091af45-6755-4939-bd14-f4c7bdceea15 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:09.465 113313 ERROR nova Traceback (most recent call last): 2026-04-02 00:36:09.465 113313 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-02 00:36:09.465 113313 ERROR nova sys.exit(main()) 2026-04-02 00:36:09.465 113313 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-02 00:36:09.465 113313 ERROR nova server = service.Service.create( 2026-04-02 00:36:09.465 113313 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-02 00:36:09.465 113313 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-02 00:36:09.465 113313 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-02 00:36:09.465 113313 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-02 00:36:09.465 113313 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-02 00:36:09.465 113313 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-02 00:36:09.465 113313 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-02 00:36:09.465 113313 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-02 00:36:09.465 113313 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-02 00:36:09.465 113313 ERROR nova 
self._client = self._create_client() 2026-04-02 00:36:09.465 113313 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-02 00:36:09.465 113313 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-02 00:36:09.465 113313 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-02 00:36:09.465 113313 ERROR nova return getattr(conn, service_type) 2026-04-02 00:36:09.465 113313 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-02 00:36:09.465 113313 ERROR nova proxy = self._make_proxy(instance) 2026-04-02 00:36:09.465 113313 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-02 00:36:09.465 113313 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-02 00:36:09.465 113313 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-02 00:36:09.465 113313 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-02 00:36:09.465 113313 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-02 00:36:09.465 113313 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-02 00:36:09.465 113313 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-02 00:36:09.465 113313 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-02 00:36:09.465 113313 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:09.465 113313 ERROR nova 2026-04-02 00:36:11.905 114271 DEBUG oslo_db.sqlalchemy.engines [req-61711bb1-3578-4974-bc36-7aa2189b72cd - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-02 00:36:11.944 114271 DEBUG nova.scheduler.host_manager [req-61711bb1-3578-4974-bc36-7aa2189b72cd - - - - -] Found 1 cells: 93adf21c-64ef-44ce-b3d6-e2add9f3b92e refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-02 00:36:11.945 114271 DEBUG nova.scheduler.host_manager [req-61711bb1-3578-4974-bc36-7aa2189b72cd - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-02 00:36:11.950 114271 WARNING nova.scheduler.filters.availability_zone_filter [req-61711bb1-3578-4974-bc36-7aa2189b72cd - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-02 00:36:12.039 114271 ERROR nova.scheduler.client.report [req-61711bb1-3578-4974-bc36-7aa2189b72cd - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:12.040 114271 CRITICAL nova [req-61711bb1-3578-4974-bc36-7aa2189b72cd - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:12.040 114271 ERROR nova Traceback (most recent call last): 2026-04-02 00:36:12.040 114271 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-02 00:36:12.040 114271 ERROR nova sys.exit(main()) 2026-04-02 00:36:12.040 114271 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-02 00:36:12.040 114271 ERROR nova server = service.Service.create( 2026-04-02 00:36:12.040 114271 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-02 00:36:12.040 114271 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-02 00:36:12.040 114271 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-02 00:36:12.040 114271 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-02 00:36:12.040 114271 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-02 00:36:12.040 114271 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-02 00:36:12.040 114271 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-02 00:36:12.040 114271 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-02 00:36:12.040 114271 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-02 00:36:12.040 114271 ERROR nova 
self._client = self._create_client() 2026-04-02 00:36:12.040 114271 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-02 00:36:12.040 114271 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-02 00:36:12.040 114271 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-02 00:36:12.040 114271 ERROR nova return getattr(conn, service_type) 2026-04-02 00:36:12.040 114271 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-02 00:36:12.040 114271 ERROR nova proxy = self._make_proxy(instance) 2026-04-02 00:36:12.040 114271 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-02 00:36:12.040 114271 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-02 00:36:12.040 114271 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-02 00:36:12.040 114271 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-02 00:36:12.040 114271 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-02 00:36:12.040 114271 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-02 00:36:12.040 114271 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-02 00:36:12.040 114271 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-02 00:36:12.040 114271 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:12.040 114271 ERROR nova 2026-04-02 00:36:14.205 115401 DEBUG oslo_db.sqlalchemy.engines [req-7e9f577e-890e-4c0f-87ce-64bce3716ec0 - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-02 00:36:14.245 115401 DEBUG nova.scheduler.host_manager [req-7e9f577e-890e-4c0f-87ce-64bce3716ec0 - - - - -] Found 1 cells: 93adf21c-64ef-44ce-b3d6-e2add9f3b92e refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-02 00:36:14.246 115401 DEBUG nova.scheduler.host_manager [req-7e9f577e-890e-4c0f-87ce-64bce3716ec0 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-02 00:36:14.251 115401 WARNING nova.scheduler.filters.availability_zone_filter [req-7e9f577e-890e-4c0f-87ce-64bce3716ec0 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-02 00:36:14.353 115401 ERROR nova.scheduler.client.report [req-7e9f577e-890e-4c0f-87ce-64bce3716ec0 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:14.354 115401 CRITICAL nova [req-7e9f577e-890e-4c0f-87ce-64bce3716ec0 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:14.354 115401 ERROR nova Traceback (most recent call last): 2026-04-02 00:36:14.354 115401 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-02 00:36:14.354 115401 ERROR nova sys.exit(main()) 2026-04-02 00:36:14.354 115401 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-02 00:36:14.354 115401 ERROR nova server = service.Service.create( 2026-04-02 00:36:14.354 115401 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-02 00:36:14.354 115401 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-02 00:36:14.354 115401 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-02 00:36:14.354 115401 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-02 00:36:14.354 115401 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-02 00:36:14.354 115401 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-02 00:36:14.354 115401 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-02 00:36:14.354 115401 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-02 00:36:14.354 115401 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-02 00:36:14.354 115401 ERROR nova 
self._client = self._create_client() 2026-04-02 00:36:14.354 115401 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-02 00:36:14.354 115401 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-02 00:36:14.354 115401 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-02 00:36:14.354 115401 ERROR nova return getattr(conn, service_type) 2026-04-02 00:36:14.354 115401 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-02 00:36:14.354 115401 ERROR nova proxy = self._make_proxy(instance) 2026-04-02 00:36:14.354 115401 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-02 00:36:14.354 115401 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-02 00:36:14.354 115401 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-02 00:36:14.354 115401 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-02 00:36:14.354 115401 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-02 00:36:14.354 115401 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-02 00:36:14.354 115401 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-02 00:36:14.354 115401 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-02 00:36:14.354 115401 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:14.354 115401 ERROR nova 2026-04-02 00:36:16.746 116405 DEBUG oslo_db.sqlalchemy.engines [req-83f02bcf-150a-4675-9b22-100c8d8e5ade - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-02 00:36:16.779 116405 DEBUG nova.scheduler.host_manager [req-83f02bcf-150a-4675-9b22-100c8d8e5ade - - - - -] Found 1 cells: 93adf21c-64ef-44ce-b3d6-e2add9f3b92e refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-02 00:36:16.780 116405 DEBUG nova.scheduler.host_manager [req-83f02bcf-150a-4675-9b22-100c8d8e5ade - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-02 00:36:16.784 116405 WARNING nova.scheduler.filters.availability_zone_filter [req-83f02bcf-150a-4675-9b22-100c8d8e5ade - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-02 00:36:16.868 116405 ERROR nova.scheduler.client.report [req-83f02bcf-150a-4675-9b22-100c8d8e5ade - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:16.868 116405 CRITICAL nova [req-83f02bcf-150a-4675-9b22-100c8d8e5ade - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:16.868 116405 ERROR nova Traceback (most recent call last): 2026-04-02 00:36:16.868 116405 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-02 00:36:16.868 116405 ERROR nova sys.exit(main()) 2026-04-02 00:36:16.868 116405 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-02 00:36:16.868 116405 ERROR nova server = service.Service.create( 2026-04-02 00:36:16.868 116405 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-02 00:36:16.868 116405 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-02 00:36:16.868 116405 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-02 00:36:16.868 116405 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-02 00:36:16.868 116405 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-02 00:36:16.868 116405 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-02 00:36:16.868 116405 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-02 00:36:16.868 116405 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-02 00:36:16.868 116405 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-02 00:36:16.868 116405 ERROR nova 
self._client = self._create_client() 2026-04-02 00:36:16.868 116405 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-02 00:36:16.868 116405 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-02 00:36:16.868 116405 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-02 00:36:16.868 116405 ERROR nova return getattr(conn, service_type) 2026-04-02 00:36:16.868 116405 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-02 00:36:16.868 116405 ERROR nova proxy = self._make_proxy(instance) 2026-04-02 00:36:16.868 116405 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-02 00:36:16.868 116405 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-02 00:36:16.868 116405 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-02 00:36:16.868 116405 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-02 00:36:16.868 116405 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-02 00:36:16.868 116405 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-02 00:36:16.868 116405 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-02 00:36:16.868 116405 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-02 00:36:16.868 116405 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:16.868 116405 ERROR nova 2026-04-02 00:36:19.120 117087 DEBUG oslo_db.sqlalchemy.engines [req-265a454c-477e-4331-98f2-7b5e569ec255 - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-02 00:36:19.154 117087 DEBUG nova.scheduler.host_manager [req-265a454c-477e-4331-98f2-7b5e569ec255 - - - - -] Found 1 cells: 93adf21c-64ef-44ce-b3d6-e2add9f3b92e refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-02 00:36:19.154 117087 DEBUG nova.scheduler.host_manager [req-265a454c-477e-4331-98f2-7b5e569ec255 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-02 00:36:19.158 117087 WARNING nova.scheduler.filters.availability_zone_filter [req-265a454c-477e-4331-98f2-7b5e569ec255 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-02 00:36:19.240 117087 ERROR nova.scheduler.client.report [req-265a454c-477e-4331-98f2-7b5e569ec255 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:19.241 117087 CRITICAL nova [req-265a454c-477e-4331-98f2-7b5e569ec255 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:19.241 117087 ERROR nova Traceback (most recent call last): 2026-04-02 00:36:19.241 117087 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-02 00:36:19.241 117087 ERROR nova sys.exit(main()) 2026-04-02 00:36:19.241 117087 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-02 00:36:19.241 117087 ERROR nova server = service.Service.create( 2026-04-02 00:36:19.241 117087 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-02 00:36:19.241 117087 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-02 00:36:19.241 117087 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-02 00:36:19.241 117087 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-02 00:36:19.241 117087 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-02 00:36:19.241 117087 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-02 00:36:19.241 117087 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-02 00:36:19.241 117087 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-02 00:36:19.241 117087 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-02 00:36:19.241 117087 ERROR nova 
self._client = self._create_client() 2026-04-02 00:36:19.241 117087 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-02 00:36:19.241 117087 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-02 00:36:19.241 117087 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-02 00:36:19.241 117087 ERROR nova return getattr(conn, service_type) 2026-04-02 00:36:19.241 117087 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-02 00:36:19.241 117087 ERROR nova proxy = self._make_proxy(instance) 2026-04-02 00:36:19.241 117087 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-02 00:36:19.241 117087 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-02 00:36:19.241 117087 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-02 00:36:19.241 117087 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-02 00:36:19.241 117087 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-02 00:36:19.241 117087 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-02 00:36:19.241 117087 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-02 00:36:19.241 117087 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-02 00:36:19.241 117087 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:19.241 117087 ERROR nova 2026-04-02 00:36:21.466 117512 DEBUG oslo_db.sqlalchemy.engines [req-116856ed-5ed5-4d57-a210-ad20707310b7 - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-02 00:36:21.496 117512 DEBUG nova.scheduler.host_manager [req-116856ed-5ed5-4d57-a210-ad20707310b7 - - - - -] Found 1 cells: 93adf21c-64ef-44ce-b3d6-e2add9f3b92e refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730
2026-04-02 00:36:21.497 117512 DEBUG nova.scheduler.host_manager [req-116856ed-5ed5-4d57-a210-ad20707310b7 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742
2026-04-02 00:36:21.500 117512 WARNING nova.scheduler.filters.availability_zone_filter [req-116856ed-5ed5-4d57-a210-ad20707310b7 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement.
2026-04-02 00:36:21.586 117512 ERROR nova.scheduler.client.report [req-116856ed-5ed5-4d57-a210-ad20707310b7 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-02 00:36:21.587 117512 CRITICAL nova [req-116856ed-5ed5-4d57-a210-ad20707310b7 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-02 00:36:21.587 117512 ERROR nova Traceback (most recent call last):
2026-04-02 00:36:21.587 117512 ERROR nova   File "/usr/bin/nova-scheduler", line 10, in <module>
2026-04-02 00:36:21.587 117512 ERROR nova     sys.exit(main())
2026-04-02 00:36:21.587 117512 ERROR nova   File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main
2026-04-02 00:36:21.587 117512 ERROR nova     server = service.Service.create(
2026-04-02 00:36:21.587 117512 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create
2026-04-02 00:36:21.587 117512 ERROR nova     service_obj = cls(host, binary, topic, manager,
2026-04-02 00:36:21.587 117512 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__
2026-04-02 00:36:21.587 117512 ERROR nova     self.manager = manager_class(host=self.host, *args, **kwargs)
2026-04-02 00:36:21.587 117512 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__
2026-04-02 00:36:21.587 117512 ERROR nova     self.placement_client = report.report_client_singleton()
2026-04-02 00:36:21.587 117512 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton
2026-04-02 00:36:21.587 117512 ERROR nova     PLACEMENTCLIENT = SchedulerReportClient()
2026-04-02 00:36:21.587 117512 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__
2026-04-02 00:36:21.587 117512 ERROR nova     self._client = self._create_client()
2026-04-02 00:36:21.587 117512 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client
2026-04-02 00:36:21.587 117512 ERROR nova     client = self._adapter or utils.get_sdk_adapter('placement')
2026-04-02 00:36:21.587 117512 ERROR nova   File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter
2026-04-02 00:36:21.587 117512 ERROR nova     return getattr(conn, service_type)
2026-04-02 00:36:21.587 117512 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__
2026-04-02 00:36:21.587 117512 ERROR nova     proxy = self._make_proxy(instance)
2026-04-02 00:36:21.587 117512 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy
2026-04-02 00:36:21.587 117512 ERROR nova     found_version = temp_adapter.get_api_major_version()
2026-04-02 00:36:21.587 117512 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version
2026-04-02 00:36:21.587 117512 ERROR nova     return self.session.get_api_major_version(auth or self.auth, **kwargs)
2026-04-02 00:36:21.587 117512 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version
2026-04-02 00:36:21.587 117512 ERROR nova     auth = self._auth_required(auth, 'determine endpoint URL')
2026-04-02 00:36:21.587 117512 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required
2026-04-02 00:36:21.587 117512 ERROR nova     raise exceptions.MissingAuthPlugin(msg_fmt % msg)
2026-04-02 00:36:21.587 117512 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-02 00:36:21.587 117512 ERROR nova
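The deprecation warning in the run above can be addressed independently of the crash: since AZ-to-placement mapping is enabled by default via 'query_placement_for_availability_zone', the AvailabilityZoneFilter can simply be dropped from the scheduler's filter list. A minimal sketch of what that looks like in nova.conf; the filter list below is illustrative, not taken from this log, and should match whatever filters the deployment actually uses:

```ini
# /etc/nova/nova.conf -- illustrative [filter_scheduler] section.
[filter_scheduler]
# AvailabilityZoneFilter removed: AZ enforcement happens via placement
# aggregates because query_placement_for_availability_zone defaults to
# true as of the Xena (24.0.0) release.
enabled_filters = ComputeFilter,ComputeCapabilitiesFilter,ImagePropertiesFilter,ServerGroupAntiAffinityFilter,ServerGroupAffinityFilter
```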
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-02 00:36:23.937 118273 DEBUG nova.scheduler.host_manager [req-963adc4f-c22f-4e5e-a41c-06e6c216da94 - - - - -] Found 1 cells: 93adf21c-64ef-44ce-b3d6-e2add9f3b92e refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-02 00:36:23.937 118273 DEBUG nova.scheduler.host_manager [req-963adc4f-c22f-4e5e-a41c-06e6c216da94 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-02 00:36:23.941 118273 WARNING nova.scheduler.filters.availability_zone_filter [req-963adc4f-c22f-4e5e-a41c-06e6c216da94 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-02 00:36:24.023 118273 ERROR nova.scheduler.client.report [req-963adc4f-c22f-4e5e-a41c-06e6c216da94 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:24.023 118273 CRITICAL nova [req-963adc4f-c22f-4e5e-a41c-06e6c216da94 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:24.023 118273 ERROR nova Traceback (most recent call last): 2026-04-02 00:36:24.023 118273 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-02 00:36:24.023 118273 ERROR nova sys.exit(main()) 2026-04-02 00:36:24.023 118273 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-02 00:36:24.023 118273 ERROR nova server = service.Service.create( 2026-04-02 00:36:24.023 118273 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-02 00:36:24.023 118273 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-02 00:36:24.023 118273 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-02 00:36:24.023 118273 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-02 00:36:24.023 118273 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-02 00:36:24.023 118273 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-02 00:36:24.023 118273 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-02 00:36:24.023 118273 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-02 00:36:24.023 118273 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-02 00:36:24.023 118273 ERROR nova 
self._client = self._create_client() 2026-04-02 00:36:24.023 118273 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-02 00:36:24.023 118273 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-02 00:36:24.023 118273 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-02 00:36:24.023 118273 ERROR nova return getattr(conn, service_type) 2026-04-02 00:36:24.023 118273 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-02 00:36:24.023 118273 ERROR nova proxy = self._make_proxy(instance) 2026-04-02 00:36:24.023 118273 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-02 00:36:24.023 118273 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-02 00:36:24.023 118273 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-02 00:36:24.023 118273 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-02 00:36:24.023 118273 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-02 00:36:24.023 118273 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-02 00:36:24.023 118273 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-02 00:36:24.023 118273 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-02 00:36:24.023 118273 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:24.023 118273 ERROR nova 2026-04-02 00:36:26.242 119022 DEBUG oslo_db.sqlalchemy.engines [req-4e2aef8d-0007-4f9a-a6e0-e8acefaf9d15 - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-02 00:36:26.284 119022 DEBUG nova.scheduler.host_manager [req-4e2aef8d-0007-4f9a-a6e0-e8acefaf9d15 - - - - -] Found 1 cells: 93adf21c-64ef-44ce-b3d6-e2add9f3b92e refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-02 00:36:26.285 119022 DEBUG nova.scheduler.host_manager [req-4e2aef8d-0007-4f9a-a6e0-e8acefaf9d15 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-02 00:36:26.290 119022 WARNING nova.scheduler.filters.availability_zone_filter [req-4e2aef8d-0007-4f9a-a6e0-e8acefaf9d15 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-02 00:36:26.405 119022 ERROR nova.scheduler.client.report [req-4e2aef8d-0007-4f9a-a6e0-e8acefaf9d15 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:26.406 119022 CRITICAL nova [req-4e2aef8d-0007-4f9a-a6e0-e8acefaf9d15 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:26.406 119022 ERROR nova Traceback (most recent call last): 2026-04-02 00:36:26.406 119022 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-02 00:36:26.406 119022 ERROR nova sys.exit(main()) 2026-04-02 00:36:26.406 119022 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-02 00:36:26.406 119022 ERROR nova server = service.Service.create( 2026-04-02 00:36:26.406 119022 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-02 00:36:26.406 119022 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-02 00:36:26.406 119022 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-02 00:36:26.406 119022 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-02 00:36:26.406 119022 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-02 00:36:26.406 119022 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-02 00:36:26.406 119022 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-02 00:36:26.406 119022 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-02 00:36:26.406 119022 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-02 00:36:26.406 119022 ERROR nova 
self._client = self._create_client() 2026-04-02 00:36:26.406 119022 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-02 00:36:26.406 119022 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-02 00:36:26.406 119022 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-02 00:36:26.406 119022 ERROR nova return getattr(conn, service_type) 2026-04-02 00:36:26.406 119022 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-02 00:36:26.406 119022 ERROR nova proxy = self._make_proxy(instance) 2026-04-02 00:36:26.406 119022 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-02 00:36:26.406 119022 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-02 00:36:26.406 119022 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-02 00:36:26.406 119022 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-02 00:36:26.406 119022 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-02 00:36:26.406 119022 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-02 00:36:26.406 119022 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-02 00:36:26.406 119022 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-02 00:36:26.406 119022 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:26.406 119022 ERROR nova 2026-04-02 00:36:28.688 120098 DEBUG oslo_db.sqlalchemy.engines [req-1695ecc1-2e25-4a24-98fd-fb16b689d81c - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-02 00:36:28.719 120098 DEBUG nova.scheduler.host_manager [req-1695ecc1-2e25-4a24-98fd-fb16b689d81c - - - - -] Found 1 cells: 93adf21c-64ef-44ce-b3d6-e2add9f3b92e refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-02 00:36:28.720 120098 DEBUG nova.scheduler.host_manager [req-1695ecc1-2e25-4a24-98fd-fb16b689d81c - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-02 00:36:28.725 120098 WARNING nova.scheduler.filters.availability_zone_filter [req-1695ecc1-2e25-4a24-98fd-fb16b689d81c - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-02 00:36:28.812 120098 ERROR nova.scheduler.client.report [req-1695ecc1-2e25-4a24-98fd-fb16b689d81c - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:28.813 120098 CRITICAL nova [req-1695ecc1-2e25-4a24-98fd-fb16b689d81c - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:28.813 120098 ERROR nova Traceback (most recent call last): 2026-04-02 00:36:28.813 120098 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-02 00:36:28.813 120098 ERROR nova sys.exit(main()) 2026-04-02 00:36:28.813 120098 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-02 00:36:28.813 120098 ERROR nova server = service.Service.create( 2026-04-02 00:36:28.813 120098 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-02 00:36:28.813 120098 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-02 00:36:28.813 120098 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-02 00:36:28.813 120098 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-02 00:36:28.813 120098 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-02 00:36:28.813 120098 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-02 00:36:28.813 120098 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-02 00:36:28.813 120098 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-02 00:36:28.813 120098 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-02 00:36:28.813 120098 ERROR nova 
self._client = self._create_client() 2026-04-02 00:36:28.813 120098 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-02 00:36:28.813 120098 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-02 00:36:28.813 120098 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-02 00:36:28.813 120098 ERROR nova return getattr(conn, service_type) 2026-04-02 00:36:28.813 120098 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-02 00:36:28.813 120098 ERROR nova proxy = self._make_proxy(instance) 2026-04-02 00:36:28.813 120098 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-02 00:36:28.813 120098 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-02 00:36:28.813 120098 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-02 00:36:28.813 120098 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-02 00:36:28.813 120098 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-02 00:36:28.813 120098 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-02 00:36:28.813 120098 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-02 00:36:28.813 120098 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-02 00:36:28.813 120098 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:28.813 120098 ERROR nova 2026-04-02 00:36:31.102 121086 DEBUG oslo_db.sqlalchemy.engines [req-8f6fda54-f899-466d-8ac3-8b0395f22a22 - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-02 00:36:31.139 121086 DEBUG nova.scheduler.host_manager [req-8f6fda54-f899-466d-8ac3-8b0395f22a22 - - - - -] Found 1 cells: 93adf21c-64ef-44ce-b3d6-e2add9f3b92e refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-02 00:36:31.140 121086 DEBUG nova.scheduler.host_manager [req-8f6fda54-f899-466d-8ac3-8b0395f22a22 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-02 00:36:31.144 121086 WARNING nova.scheduler.filters.availability_zone_filter [req-8f6fda54-f899-466d-8ac3-8b0395f22a22 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-02 00:36:31.227 121086 ERROR nova.scheduler.client.report [req-8f6fda54-f899-466d-8ac3-8b0395f22a22 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:31.227 121086 CRITICAL nova [req-8f6fda54-f899-466d-8ac3-8b0395f22a22 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:31.227 121086 ERROR nova Traceback (most recent call last): 2026-04-02 00:36:31.227 121086 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-02 00:36:31.227 121086 ERROR nova sys.exit(main()) 2026-04-02 00:36:31.227 121086 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-02 00:36:31.227 121086 ERROR nova server = service.Service.create( 2026-04-02 00:36:31.227 121086 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-02 00:36:31.227 121086 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-02 00:36:31.227 121086 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-02 00:36:31.227 121086 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-02 00:36:31.227 121086 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-02 00:36:31.227 121086 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-02 00:36:31.227 121086 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-02 00:36:31.227 121086 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-02 00:36:31.227 121086 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-02 00:36:31.227 121086 ERROR nova 
self._client = self._create_client() 2026-04-02 00:36:31.227 121086 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-02 00:36:31.227 121086 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-02 00:36:31.227 121086 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-02 00:36:31.227 121086 ERROR nova return getattr(conn, service_type) 2026-04-02 00:36:31.227 121086 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-02 00:36:31.227 121086 ERROR nova proxy = self._make_proxy(instance) 2026-04-02 00:36:31.227 121086 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-02 00:36:31.227 121086 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-02 00:36:31.227 121086 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-02 00:36:31.227 121086 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-02 00:36:31.227 121086 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-02 00:36:31.227 121086 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-02 00:36:31.227 121086 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-02 00:36:31.227 121086 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-02 00:36:31.227 121086 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:31.227 121086 ERROR nova 2026-04-02 00:36:33.386 121554 DEBUG oslo_db.sqlalchemy.engines [req-5ae4eb13-0f60-47b0-ae80-d99dbc764fc6 - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-02 00:36:33.423 121554 DEBUG nova.scheduler.host_manager [req-5ae4eb13-0f60-47b0-ae80-d99dbc764fc6 - - - - -] Found 1 cells: 93adf21c-64ef-44ce-b3d6-e2add9f3b92e refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-02 00:36:33.423 121554 DEBUG nova.scheduler.host_manager [req-5ae4eb13-0f60-47b0-ae80-d99dbc764fc6 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-02 00:36:33.427 121554 WARNING nova.scheduler.filters.availability_zone_filter [req-5ae4eb13-0f60-47b0-ae80-d99dbc764fc6 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-02 00:36:33.510 121554 ERROR nova.scheduler.client.report [req-5ae4eb13-0f60-47b0-ae80-d99dbc764fc6 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:33.511 121554 CRITICAL nova [req-5ae4eb13-0f60-47b0-ae80-d99dbc764fc6 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:33.511 121554 ERROR nova Traceback (most recent call last): 2026-04-02 00:36:33.511 121554 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-02 00:36:33.511 121554 ERROR nova sys.exit(main()) 2026-04-02 00:36:33.511 121554 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-02 00:36:33.511 121554 ERROR nova server = service.Service.create( 2026-04-02 00:36:33.511 121554 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-02 00:36:33.511 121554 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-02 00:36:33.511 121554 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-02 00:36:33.511 121554 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-02 00:36:33.511 121554 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-02 00:36:33.511 121554 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-02 00:36:33.511 121554 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-02 00:36:33.511 121554 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-02 00:36:33.511 121554 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-02 00:36:33.511 121554 ERROR nova 
self._client = self._create_client() 2026-04-02 00:36:33.511 121554 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-02 00:36:33.511 121554 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-02 00:36:33.511 121554 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-02 00:36:33.511 121554 ERROR nova return getattr(conn, service_type) 2026-04-02 00:36:33.511 121554 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-02 00:36:33.511 121554 ERROR nova proxy = self._make_proxy(instance) 2026-04-02 00:36:33.511 121554 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-02 00:36:33.511 121554 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-02 00:36:33.511 121554 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-02 00:36:33.511 121554 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-02 00:36:33.511 121554 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-02 00:36:33.511 121554 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-02 00:36:33.511 121554 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-02 00:36:33.511 121554 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-02 00:36:33.511 121554 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:33.511 121554 ERROR nova 2026-04-02 00:36:35.820 122159 DEBUG oslo_db.sqlalchemy.engines [req-52be4804-127f-4818-847d-56ab73e54b79 - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-02 00:36:35.857 122159 DEBUG nova.scheduler.host_manager [req-52be4804-127f-4818-847d-56ab73e54b79 - - - - -] Found 1 cells: 93adf21c-64ef-44ce-b3d6-e2add9f3b92e refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-02 00:36:35.857 122159 DEBUG nova.scheduler.host_manager [req-52be4804-127f-4818-847d-56ab73e54b79 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-02 00:36:35.861 122159 WARNING nova.scheduler.filters.availability_zone_filter [req-52be4804-127f-4818-847d-56ab73e54b79 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-02 00:36:35.950 122159 ERROR nova.scheduler.client.report [req-52be4804-127f-4818-847d-56ab73e54b79 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-02 00:36:35.951 122159 CRITICAL nova [req-52be4804-127f-4818-847d-56ab73e54b79 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-02 00:36:35.951 122159 ERROR nova Traceback (most recent call last):
2026-04-02 00:36:35.951 122159 ERROR nova   File "/usr/bin/nova-scheduler", line 10, in <module>
2026-04-02 00:36:35.951 122159 ERROR nova     sys.exit(main())
2026-04-02 00:36:35.951 122159 ERROR nova   File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main
2026-04-02 00:36:35.951 122159 ERROR nova     server = service.Service.create(
2026-04-02 00:36:35.951 122159 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create
2026-04-02 00:36:35.951 122159 ERROR nova     service_obj = cls(host, binary, topic, manager,
2026-04-02 00:36:35.951 122159 ERROR nova   File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__
2026-04-02 00:36:35.951 122159 ERROR nova     self.manager = manager_class(host=self.host, *args, **kwargs)
2026-04-02 00:36:35.951 122159 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__
2026-04-02 00:36:35.951 122159 ERROR nova     self.placement_client = report.report_client_singleton()
2026-04-02 00:36:35.951 122159 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton
2026-04-02 00:36:35.951 122159 ERROR nova     PLACEMENTCLIENT = SchedulerReportClient()
2026-04-02 00:36:35.951 122159 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__
2026-04-02 00:36:35.951 122159 ERROR nova     self._client = self._create_client()
2026-04-02 00:36:35.951 122159 ERROR nova   File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client
2026-04-02 00:36:35.951 122159 ERROR nova     client = self._adapter or utils.get_sdk_adapter('placement')
2026-04-02 00:36:35.951 122159 ERROR nova   File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter
2026-04-02 00:36:35.951 122159 ERROR nova     return getattr(conn, service_type)
2026-04-02 00:36:35.951 122159 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__
2026-04-02 00:36:35.951 122159 ERROR nova     proxy = self._make_proxy(instance)
2026-04-02 00:36:35.951 122159 ERROR nova   File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy
2026-04-02 00:36:35.951 122159 ERROR nova     found_version = temp_adapter.get_api_major_version()
2026-04-02 00:36:35.951 122159 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version
2026-04-02 00:36:35.951 122159 ERROR nova     return self.session.get_api_major_version(auth or self.auth, **kwargs)
2026-04-02 00:36:35.951 122159 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version
2026-04-02 00:36:35.951 122159 ERROR nova     auth = self._auth_required(auth, 'determine endpoint URL')
2026-04-02 00:36:35.951 122159 ERROR nova   File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required
2026-04-02 00:36:35.951 122159 ERROR nova     raise exceptions.MissingAuthPlugin(msg_fmt % msg)
2026-04-02 00:36:35.951 122159 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-02 00:36:35.951 122159 ERROR nova
2026-04-02 00:36:38.438 122971 DEBUG oslo_db.sqlalchemy.engines [req-faa229a0-bc84-40df-9922-9015081d8a5c - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-02 00:36:38.482 122971 DEBUG nova.scheduler.host_manager [req-faa229a0-bc84-40df-9922-9015081d8a5c - - - - -] Found 1 cells: 93adf21c-64ef-44ce-b3d6-e2add9f3b92e refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730
2026-04-02 00:36:38.482 122971 DEBUG nova.scheduler.host_manager [req-faa229a0-bc84-40df-9922-9015081d8a5c - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742
2026-04-02 00:36:38.489 122971 WARNING nova.scheduler.filters.availability_zone_filter [req-faa229a0-bc84-40df-9922-9015081d8a5c - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement.
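The MissingAuthPlugin crash above means nova-scheduler built its placement client without any Keystone credentials: `get_sdk_adapter('placement')` found no auth plugin configured, so it could not even determine the placement endpoint URL. This typically points at a missing or empty `[placement]` section in `/etc/nova/nova.conf`. A minimal sketch of what that section usually contains is below; the URL, domain, project, username, and password values are placeholders for this deployment, not values taken from the log:

```ini
# /etc/nova/nova.conf -- [placement] auth sketch; all values are
# hypothetical and must match your own Keystone service user.
[placement]
auth_type = password
auth_url = http://controller:5000/v3
project_domain_name = Default
user_domain_name = Default
project_name = service
username = placement
password = PLACEMENT_PASS
region_name = RegionOne
```

After adding credentials, restarting the nova-scheduler service should stop the restart loop seen in these logs, provided the placement service user and endpoint actually exist in Keystone.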
2026-04-02 00:36:38.596 122971 ERROR nova.scheduler.client.report [req-faa229a0-bc84-40df-9922-9015081d8a5c - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:38.597 122971 CRITICAL nova [req-faa229a0-bc84-40df-9922-9015081d8a5c - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:38.597 122971 ERROR nova Traceback (most recent call last): 2026-04-02 00:36:38.597 122971 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-02 00:36:38.597 122971 ERROR nova sys.exit(main()) 2026-04-02 00:36:38.597 122971 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-02 00:36:38.597 122971 ERROR nova server = service.Service.create( 2026-04-02 00:36:38.597 122971 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-02 00:36:38.597 122971 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-02 00:36:38.597 122971 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-02 00:36:38.597 122971 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-02 00:36:38.597 122971 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-02 00:36:38.597 122971 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-02 00:36:38.597 122971 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-02 00:36:38.597 122971 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-02 00:36:38.597 122971 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-02 00:36:38.597 122971 ERROR nova 
self._client = self._create_client() 2026-04-02 00:36:38.597 122971 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-02 00:36:38.597 122971 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-02 00:36:38.597 122971 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-02 00:36:38.597 122971 ERROR nova return getattr(conn, service_type) 2026-04-02 00:36:38.597 122971 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-02 00:36:38.597 122971 ERROR nova proxy = self._make_proxy(instance) 2026-04-02 00:36:38.597 122971 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-02 00:36:38.597 122971 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-02 00:36:38.597 122971 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-02 00:36:38.597 122971 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-02 00:36:38.597 122971 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-02 00:36:38.597 122971 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-02 00:36:38.597 122971 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-02 00:36:38.597 122971 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-02 00:36:38.597 122971 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:38.597 122971 ERROR nova 2026-04-02 00:36:41.052 123833 DEBUG oslo_db.sqlalchemy.engines [req-fcc9eaef-43e7-4262-93d7-d22f52c125e7 - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-02 00:36:41.083 123833 DEBUG nova.scheduler.host_manager [req-fcc9eaef-43e7-4262-93d7-d22f52c125e7 - - - - -] Found 1 cells: 93adf21c-64ef-44ce-b3d6-e2add9f3b92e refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-02 00:36:41.083 123833 DEBUG nova.scheduler.host_manager [req-fcc9eaef-43e7-4262-93d7-d22f52c125e7 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-02 00:36:41.087 123833 WARNING nova.scheduler.filters.availability_zone_filter [req-fcc9eaef-43e7-4262-93d7-d22f52c125e7 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-02 00:36:41.168 123833 ERROR nova.scheduler.client.report [req-fcc9eaef-43e7-4262-93d7-d22f52c125e7 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:41.169 123833 CRITICAL nova [req-fcc9eaef-43e7-4262-93d7-d22f52c125e7 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:41.169 123833 ERROR nova Traceback (most recent call last): 2026-04-02 00:36:41.169 123833 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-02 00:36:41.169 123833 ERROR nova sys.exit(main()) 2026-04-02 00:36:41.169 123833 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-02 00:36:41.169 123833 ERROR nova server = service.Service.create( 2026-04-02 00:36:41.169 123833 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-02 00:36:41.169 123833 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-02 00:36:41.169 123833 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-02 00:36:41.169 123833 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-02 00:36:41.169 123833 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-02 00:36:41.169 123833 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-02 00:36:41.169 123833 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-02 00:36:41.169 123833 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-02 00:36:41.169 123833 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-02 00:36:41.169 123833 ERROR nova 
self._client = self._create_client() 2026-04-02 00:36:41.169 123833 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-02 00:36:41.169 123833 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-02 00:36:41.169 123833 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-02 00:36:41.169 123833 ERROR nova return getattr(conn, service_type) 2026-04-02 00:36:41.169 123833 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-02 00:36:41.169 123833 ERROR nova proxy = self._make_proxy(instance) 2026-04-02 00:36:41.169 123833 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-02 00:36:41.169 123833 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-02 00:36:41.169 123833 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-02 00:36:41.169 123833 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-02 00:36:41.169 123833 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-02 00:36:41.169 123833 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-02 00:36:41.169 123833 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-02 00:36:41.169 123833 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-02 00:36:41.169 123833 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:41.169 123833 ERROR nova 2026-04-02 00:36:43.442 124965 DEBUG oslo_db.sqlalchemy.engines [req-5a8bb835-4f51-4bf1-8c62-6ac84901bb60 - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-02 00:36:43.474 124965 DEBUG nova.scheduler.host_manager [req-5a8bb835-4f51-4bf1-8c62-6ac84901bb60 - - - - -] Found 1 cells: 93adf21c-64ef-44ce-b3d6-e2add9f3b92e refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-02 00:36:43.475 124965 DEBUG nova.scheduler.host_manager [req-5a8bb835-4f51-4bf1-8c62-6ac84901bb60 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-02 00:36:43.478 124965 WARNING nova.scheduler.filters.availability_zone_filter [req-5a8bb835-4f51-4bf1-8c62-6ac84901bb60 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-02 00:36:43.560 124965 ERROR nova.scheduler.client.report [req-5a8bb835-4f51-4bf1-8c62-6ac84901bb60 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:43.561 124965 CRITICAL nova [req-5a8bb835-4f51-4bf1-8c62-6ac84901bb60 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:43.561 124965 ERROR nova Traceback (most recent call last): 2026-04-02 00:36:43.561 124965 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-02 00:36:43.561 124965 ERROR nova sys.exit(main()) 2026-04-02 00:36:43.561 124965 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-02 00:36:43.561 124965 ERROR nova server = service.Service.create( 2026-04-02 00:36:43.561 124965 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-02 00:36:43.561 124965 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-02 00:36:43.561 124965 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-02 00:36:43.561 124965 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-02 00:36:43.561 124965 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-02 00:36:43.561 124965 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-02 00:36:43.561 124965 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-02 00:36:43.561 124965 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-02 00:36:43.561 124965 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-02 00:36:43.561 124965 ERROR nova 
self._client = self._create_client() 2026-04-02 00:36:43.561 124965 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-02 00:36:43.561 124965 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-02 00:36:43.561 124965 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-02 00:36:43.561 124965 ERROR nova return getattr(conn, service_type) 2026-04-02 00:36:43.561 124965 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-02 00:36:43.561 124965 ERROR nova proxy = self._make_proxy(instance) 2026-04-02 00:36:43.561 124965 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-02 00:36:43.561 124965 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-02 00:36:43.561 124965 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-02 00:36:43.561 124965 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-02 00:36:43.561 124965 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-02 00:36:43.561 124965 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-02 00:36:43.561 124965 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-02 00:36:43.561 124965 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-02 00:36:43.561 124965 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:43.561 124965 ERROR nova 2026-04-02 00:36:45.727 125756 DEBUG oslo_db.sqlalchemy.engines [req-ff61a71e-c66b-43a2-a867-6c69ff57b3eb - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-02 00:36:45.789 125756 DEBUG nova.scheduler.host_manager [req-ff61a71e-c66b-43a2-a867-6c69ff57b3eb - - - - -] Found 1 cells: 93adf21c-64ef-44ce-b3d6-e2add9f3b92e refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-02 00:36:45.789 125756 DEBUG nova.scheduler.host_manager [req-ff61a71e-c66b-43a2-a867-6c69ff57b3eb - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-02 00:36:45.796 125756 WARNING nova.scheduler.filters.availability_zone_filter [req-ff61a71e-c66b-43a2-a867-6c69ff57b3eb - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-02 00:36:45.884 125756 ERROR nova.scheduler.client.report [req-ff61a71e-c66b-43a2-a867-6c69ff57b3eb - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:45.885 125756 CRITICAL nova [req-ff61a71e-c66b-43a2-a867-6c69ff57b3eb - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:45.885 125756 ERROR nova Traceback (most recent call last): 2026-04-02 00:36:45.885 125756 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-02 00:36:45.885 125756 ERROR nova sys.exit(main()) 2026-04-02 00:36:45.885 125756 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-02 00:36:45.885 125756 ERROR nova server = service.Service.create( 2026-04-02 00:36:45.885 125756 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-02 00:36:45.885 125756 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-02 00:36:45.885 125756 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-02 00:36:45.885 125756 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-02 00:36:45.885 125756 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-02 00:36:45.885 125756 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-02 00:36:45.885 125756 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-02 00:36:45.885 125756 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-02 00:36:45.885 125756 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-02 00:36:45.885 125756 ERROR nova 
self._client = self._create_client() 2026-04-02 00:36:45.885 125756 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-02 00:36:45.885 125756 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-02 00:36:45.885 125756 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-02 00:36:45.885 125756 ERROR nova return getattr(conn, service_type) 2026-04-02 00:36:45.885 125756 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-02 00:36:45.885 125756 ERROR nova proxy = self._make_proxy(instance) 2026-04-02 00:36:45.885 125756 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-02 00:36:45.885 125756 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-02 00:36:45.885 125756 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-02 00:36:45.885 125756 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-02 00:36:45.885 125756 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-02 00:36:45.885 125756 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-02 00:36:45.885 125756 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-02 00:36:45.885 125756 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-02 00:36:45.885 125756 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:45.885 125756 ERROR nova 2026-04-02 00:36:48.211 126291 DEBUG oslo_db.sqlalchemy.engines [req-bb436942-2ca7-4189-b4eb-e18e36ce5384 - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-02 00:36:48.243 126291 DEBUG nova.scheduler.host_manager [req-bb436942-2ca7-4189-b4eb-e18e36ce5384 - - - - -] Found 1 cells: 93adf21c-64ef-44ce-b3d6-e2add9f3b92e refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-02 00:36:48.243 126291 DEBUG nova.scheduler.host_manager [req-bb436942-2ca7-4189-b4eb-e18e36ce5384 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-02 00:36:48.247 126291 WARNING nova.scheduler.filters.availability_zone_filter [req-bb436942-2ca7-4189-b4eb-e18e36ce5384 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-02 00:36:48.330 126291 ERROR nova.scheduler.client.report [req-bb436942-2ca7-4189-b4eb-e18e36ce5384 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:48.331 126291 CRITICAL nova [req-bb436942-2ca7-4189-b4eb-e18e36ce5384 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:48.331 126291 ERROR nova Traceback (most recent call last): 2026-04-02 00:36:48.331 126291 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-02 00:36:48.331 126291 ERROR nova sys.exit(main()) 2026-04-02 00:36:48.331 126291 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-02 00:36:48.331 126291 ERROR nova server = service.Service.create( 2026-04-02 00:36:48.331 126291 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-02 00:36:48.331 126291 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-02 00:36:48.331 126291 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-02 00:36:48.331 126291 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-02 00:36:48.331 126291 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-02 00:36:48.331 126291 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-02 00:36:48.331 126291 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-02 00:36:48.331 126291 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-02 00:36:48.331 126291 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-02 00:36:48.331 126291 ERROR nova 
self._client = self._create_client() 2026-04-02 00:36:48.331 126291 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-02 00:36:48.331 126291 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-02 00:36:48.331 126291 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-02 00:36:48.331 126291 ERROR nova return getattr(conn, service_type) 2026-04-02 00:36:48.331 126291 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-02 00:36:48.331 126291 ERROR nova proxy = self._make_proxy(instance) 2026-04-02 00:36:48.331 126291 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-02 00:36:48.331 126291 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-02 00:36:48.331 126291 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-02 00:36:48.331 126291 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-02 00:36:48.331 126291 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-02 00:36:48.331 126291 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-02 00:36:48.331 126291 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-02 00:36:48.331 126291 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-02 00:36:48.331 126291 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:48.331 126291 ERROR nova 2026-04-02 00:36:50.790 126672 DEBUG oslo_db.sqlalchemy.engines [req-03cde6c1-8505-4a73-8172-f7daf3f32559 - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-02 00:36:50.831 126672 DEBUG nova.scheduler.host_manager [req-03cde6c1-8505-4a73-8172-f7daf3f32559 - - - - -] Found 1 cells: 93adf21c-64ef-44ce-b3d6-e2add9f3b92e refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-02 00:36:50.832 126672 DEBUG nova.scheduler.host_manager [req-03cde6c1-8505-4a73-8172-f7daf3f32559 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-02 00:36:50.842 126672 WARNING nova.scheduler.filters.availability_zone_filter [req-03cde6c1-8505-4a73-8172-f7daf3f32559 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-02 00:36:50.943 126672 ERROR nova.scheduler.client.report [req-03cde6c1-8505-4a73-8172-f7daf3f32559 - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:50.944 126672 CRITICAL nova [req-03cde6c1-8505-4a73-8172-f7daf3f32559 - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:50.944 126672 ERROR nova Traceback (most recent call last): 2026-04-02 00:36:50.944 126672 ERROR nova File "/usr/bin/nova-scheduler", line 10, in 2026-04-02 00:36:50.944 126672 ERROR nova sys.exit(main()) 2026-04-02 00:36:50.944 126672 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main 2026-04-02 00:36:50.944 126672 ERROR nova server = service.Service.create( 2026-04-02 00:36:50.944 126672 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create 2026-04-02 00:36:50.944 126672 ERROR nova service_obj = cls(host, binary, topic, manager, 2026-04-02 00:36:50.944 126672 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__ 2026-04-02 00:36:50.944 126672 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs) 2026-04-02 00:36:50.944 126672 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__ 2026-04-02 00:36:50.944 126672 ERROR nova self.placement_client = report.report_client_singleton() 2026-04-02 00:36:50.944 126672 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton 2026-04-02 00:36:50.944 126672 ERROR nova PLACEMENTCLIENT = SchedulerReportClient() 2026-04-02 00:36:50.944 126672 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__ 2026-04-02 00:36:50.944 126672 ERROR nova 
self._client = self._create_client() 2026-04-02 00:36:50.944 126672 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client 2026-04-02 00:36:50.944 126672 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement') 2026-04-02 00:36:50.944 126672 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter 2026-04-02 00:36:50.944 126672 ERROR nova return getattr(conn, service_type) 2026-04-02 00:36:50.944 126672 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__ 2026-04-02 00:36:50.944 126672 ERROR nova proxy = self._make_proxy(instance) 2026-04-02 00:36:50.944 126672 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy 2026-04-02 00:36:50.944 126672 ERROR nova found_version = temp_adapter.get_api_major_version() 2026-04-02 00:36:50.944 126672 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version 2026-04-02 00:36:50.944 126672 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs) 2026-04-02 00:36:50.944 126672 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version 2026-04-02 00:36:50.944 126672 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL') 2026-04-02 00:36:50.944 126672 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required 2026-04-02 00:36:50.944 126672 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg) 2026-04-02 00:36:50.944 126672 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL 2026-04-02 00:36:50.944 126672 ERROR nova 2026-04-02 00:36:53.275 126990 DEBUG oslo_db.sqlalchemy.engines [req-edaed531-52f9-4099-bdca-231f8d0616cb - - - - -] MySQL server mode set to 
STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-02 00:36:53.316 126990 DEBUG nova.scheduler.host_manager [req-edaed531-52f9-4099-bdca-231f8d0616cb - - - - -] Found 1 cells: 93adf21c-64ef-44ce-b3d6-e2add9f3b92e refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730 2026-04-02 00:36:53.316 126990 DEBUG nova.scheduler.host_manager [req-edaed531-52f9-4099-bdca-231f8d0616cb - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742 2026-04-02 00:36:53.321 126990 WARNING nova.scheduler.filters.availability_zone_filter [req-edaed531-52f9-4099-bdca-231f8d0616cb - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement. 
2026-04-02 00:36:53.410 126990 ERROR nova.scheduler.client.report [req-edaed531-52f9-4099-bdca-231f8d0616cb - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-02 00:36:53.410 126990 CRITICAL nova [req-edaed531-52f9-4099-bdca-231f8d0616cb - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-02 00:36:53.410 126990 ERROR nova Traceback (most recent call last):
2026-04-02 00:36:53.410 126990 ERROR nova File "/usr/bin/nova-scheduler", line 10, in
2026-04-02 00:36:53.410 126990 ERROR nova sys.exit(main())
2026-04-02 00:36:53.410 126990 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main
2026-04-02 00:36:53.410 126990 ERROR nova server = service.Service.create(
2026-04-02 00:36:53.410 126990 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create
2026-04-02 00:36:53.410 126990 ERROR nova service_obj = cls(host, binary, topic, manager,
2026-04-02 00:36:53.410 126990 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__
2026-04-02 00:36:53.410 126990 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs)
2026-04-02 00:36:53.410 126990 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__
2026-04-02 00:36:53.410 126990 ERROR nova self.placement_client = report.report_client_singleton()
2026-04-02 00:36:53.410 126990 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton
2026-04-02 00:36:53.410 126990 ERROR nova PLACEMENTCLIENT = SchedulerReportClient()
2026-04-02 00:36:53.410 126990 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__
2026-04-02 00:36:53.410 126990 ERROR nova self._client = self._create_client()
2026-04-02 00:36:53.410 126990 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client
2026-04-02 00:36:53.410 126990 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement')
2026-04-02 00:36:53.410 126990 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter
2026-04-02 00:36:53.410 126990 ERROR nova return getattr(conn, service_type)
2026-04-02 00:36:53.410 126990 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__
2026-04-02 00:36:53.410 126990 ERROR nova proxy = self._make_proxy(instance)
2026-04-02 00:36:53.410 126990 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy
2026-04-02 00:36:53.410 126990 ERROR nova found_version = temp_adapter.get_api_major_version()
2026-04-02 00:36:53.410 126990 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version
2026-04-02 00:36:53.410 126990 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs)
2026-04-02 00:36:53.410 126990 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version
2026-04-02 00:36:53.410 126990 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL')
2026-04-02 00:36:53.410 126990 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required
2026-04-02 00:36:53.410 126990 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg)
2026-04-02 00:36:53.410 126990 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-02 00:36:53.410 126990 ERROR nova
2026-04-02 00:36:55.855 127473 DEBUG oslo_db.sqlalchemy.engines [req-5bc7d9e8-b154-4ee9-8d61-faeb138337be - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-02 00:36:55.889 127473 DEBUG nova.scheduler.host_manager [req-5bc7d9e8-b154-4ee9-8d61-faeb138337be - - - - -] Found 1 cells: 93adf21c-64ef-44ce-b3d6-e2add9f3b92e refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730
2026-04-02 00:36:55.889 127473 DEBUG nova.scheduler.host_manager [req-5bc7d9e8-b154-4ee9-8d61-faeb138337be - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742
2026-04-02 00:36:55.893 127473 WARNING nova.scheduler.filters.availability_zone_filter [req-5bc7d9e8-b154-4ee9-8d61-faeb138337be - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement.
2026-04-02 00:36:55.978 127473 ERROR nova.scheduler.client.report [req-5bc7d9e8-b154-4ee9-8d61-faeb138337be - - - - -] No authentication information found for placement API.: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-02 00:36:55.978 127473 CRITICAL nova [req-5bc7d9e8-b154-4ee9-8d61-faeb138337be - - - - -] Unhandled error: keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-02 00:36:55.978 127473 ERROR nova Traceback (most recent call last):
2026-04-02 00:36:55.978 127473 ERROR nova File "/usr/bin/nova-scheduler", line 10, in
2026-04-02 00:36:55.978 127473 ERROR nova sys.exit(main())
2026-04-02 00:36:55.978 127473 ERROR nova File "/usr/lib/python3/dist-packages/nova/cmd/scheduler.py", line 47, in main
2026-04-02 00:36:55.978 127473 ERROR nova server = service.Service.create(
2026-04-02 00:36:55.978 127473 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 252, in create
2026-04-02 00:36:55.978 127473 ERROR nova service_obj = cls(host, binary, topic, manager,
2026-04-02 00:36:55.978 127473 ERROR nova File "/usr/lib/python3/dist-packages/nova/service.py", line 116, in __init__
2026-04-02 00:36:55.978 127473 ERROR nova self.manager = manager_class(host=self.host, *args, **kwargs)
2026-04-02 00:36:55.978 127473 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/manager.py", line 69, in __init__
2026-04-02 00:36:55.978 127473 ERROR nova self.placement_client = report.report_client_singleton()
2026-04-02 00:36:55.978 127473 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 91, in report_client_singleton
2026-04-02 00:36:55.978 127473 ERROR nova PLACEMENTCLIENT = SchedulerReportClient()
2026-04-02 00:36:55.978 127473 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 234, in __init__
2026-04-02 00:36:55.978 127473 ERROR nova self._client = self._create_client()
2026-04-02 00:36:55.978 127473 ERROR nova File "/usr/lib/python3/dist-packages/nova/scheduler/client/report.py", line 277, in _create_client
2026-04-02 00:36:55.978 127473 ERROR nova client = self._adapter or utils.get_sdk_adapter('placement')
2026-04-02 00:36:55.978 127473 ERROR nova File "/usr/lib/python3/dist-packages/nova/utils.py", line 985, in get_sdk_adapter
2026-04-02 00:36:55.978 127473 ERROR nova return getattr(conn, service_type)
2026-04-02 00:36:55.978 127473 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 87, in __get__
2026-04-02 00:36:55.978 127473 ERROR nova proxy = self._make_proxy(instance)
2026-04-02 00:36:55.978 127473 ERROR nova File "/usr/lib/python3/dist-packages/openstack/service_description.py", line 262, in _make_proxy
2026-04-02 00:36:55.978 127473 ERROR nova found_version = temp_adapter.get_api_major_version()
2026-04-02 00:36:55.978 127473 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/adapter.py", line 354, in get_api_major_version
2026-04-02 00:36:55.978 127473 ERROR nova return self.session.get_api_major_version(auth or self.auth, **kwargs)
2026-04-02 00:36:55.978 127473 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1275, in get_api_major_version
2026-04-02 00:36:55.978 127473 ERROR nova auth = self._auth_required(auth, 'determine endpoint URL')
2026-04-02 00:36:55.978 127473 ERROR nova File "/usr/lib/python3/dist-packages/keystoneauth1/session.py", line 1181, in _auth_required
2026-04-02 00:36:55.978 127473 ERROR nova raise exceptions.MissingAuthPlugin(msg_fmt % msg)
2026-04-02 00:36:55.978 127473 ERROR nova keystoneauth1.exceptions.auth_plugins.MissingAuthPlugin: An auth plugin is required to determine endpoint URL
2026-04-02 00:36:55.978 127473 ERROR nova
2026-04-02 00:38:35.970 142933 DEBUG oslo_db.sqlalchemy.engines [req-c2b40bfd-4e1a-41ad-9d8c-e7442894eca0 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-02 00:38:36.018 142933 DEBUG nova.scheduler.host_manager [req-c2b40bfd-4e1a-41ad-9d8c-e7442894eca0 - - - - -] Found 1 cells: 93adf21c-64ef-44ce-b3d6-e2add9f3b92e refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:730
2026-04-02 00:38:36.018 142933 DEBUG nova.scheduler.host_manager [req-c2b40bfd-4e1a-41ad-9d8c-e7442894eca0 - - - - -] Found 0 disabled cells: refresh_cells_caches /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:742
2026-04-02 00:38:36.024 142933 WARNING nova.scheduler.filters.availability_zone_filter [req-c2b40bfd-4e1a-41ad-9d8c-e7442894eca0 - - - - -] The 'AvailabilityZoneFilter' is deprecated since the 24.0.0 (Xena) release. Since the 18.0.0 (Rocky) release, nova has supported mapping AZs to placement aggregates. The feature is enabled by the 'query_placement_for_availability_zone' config option and is now enabled by default. As such, the 'AvailabilityZoneFilter' is no longer required. Nova is currently configured to use both placement and the AvailabilityZoneFilter for AZ enforcement.
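The repeated crash above, `MissingAuthPlugin: An auth plugin is required to determine endpoint URL` raised while building the placement client, is what keystoneauth raises when the `[placement]` section of nova.conf carries no usable credentials. A hedged sketch of the kind of section that would give the scheduler an auth plugin; every endpoint, domain, project, and credential value below is a placeholder, not read from this log:

```ini
# Hypothetical /etc/nova/nova.conf fragment (values are placeholders).
# These are the standard keystoneauth "password" plugin options that
# nova's [placement] section accepts; without them the scheduler's
# SchedulerReportClient cannot determine the placement endpoint URL.
[placement]
auth_type = password
auth_url = http://keystone.example.com:5000/v3
project_domain_name = Default
user_domain_name = Default
project_name = services
username = placement
password = PLACEMENT_PASS
region_name = RegionOne
```

The later portion of this log (scheduler workers starting at 00:38:37) is consistent with the configuration having been corrected between the 00:36 crashes and the 00:38 restart.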
2026-04-02 00:38:36.132 142933 DEBUG nova.scheduler.host_manager [req-c2b40bfd-4e1a-41ad-9d8c-e7442894eca0 - - - - -] START:_async_init_instance_info _async_init_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:411
2026-04-02 00:38:36.134 142933 DEBUG oslo_concurrency.lockutils [req-d5e3fea0-a431-4d97-8e49-4cfa0c11815f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:38:36.136 142933 DEBUG oslo_concurrency.lockutils [req-d5e3fea0-a431-4d97-8e49-4cfa0c11815f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:38:36.146 142933 DEBUG oslo_db.sqlalchemy.engines [req-d5e3fea0-a431-4d97-8e49-4cfa0c11815f - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-02 00:38:36.171 142933 DEBUG nova.scheduler.host_manager [req-d5e3fea0-a431-4d97-8e49-4cfa0c11815f - - - - -] Total number of compute nodes: 0 _async_init_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:424
2026-04-02 00:38:36.172 142933 DEBUG oslo_concurrency.lockutils [req-d5e3fea0-a431-4d97-8e49-4cfa0c11815f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:38:36.173 142933 DEBUG oslo_concurrency.lockutils [req-d5e3fea0-a431-4d97-8e49-4cfa0c11815f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:38:36.228 142933 DEBUG nova.scheduler.host_manager [req-d5e3fea0-a431-4d97-8e49-4cfa0c11815f - - - - -] Adding 0 instances for hosts 10-20 _async_init_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:443
2026-04-02 00:38:36.229 142933 DEBUG nova.scheduler.host_manager [req-d5e3fea0-a431-4d97-8e49-4cfa0c11815f - - - - -] END:_async_init_instance_info _async_init_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:454
2026-04-02 00:38:37.846 142933 DEBUG nova.context [req-c2b40bfd-4e1a-41ad-9d8c-e7442894eca0 - - - - -] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),93adf21c-64ef-44ce-b3d6-e2add9f3b92e(cell1) load_cells /usr/lib/python3/dist-packages/nova/context.py:464
2026-04-02 00:38:37.847 142933 DEBUG oslo_concurrency.lockutils [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:38:37.848 142933 DEBUG oslo_concurrency.lockutils [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:38:37.848 142933 DEBUG oslo_concurrency.lockutils [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:38:37.848 142933 DEBUG oslo_concurrency.lockutils [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:38:37.859 142933 DEBUG oslo_db.sqlalchemy.engines [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-02 00:38:37.904 142933 DEBUG oslo_concurrency.lockutils [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:294
2026-04-02 00:38:37.905 142933 DEBUG oslo_concurrency.lockutils [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:312
2026-04-02 00:38:37.905 142933 INFO oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] Starting 4 workers
2026-04-02 00:38:37.909 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] Started child 143779 _start_child /usr/lib/python3/dist-packages/oslo_service/service.py:575
2026-04-02 00:38:37.916 143779 INFO nova.service [-] Starting scheduler node (version 25.2.1)
2026-04-02 00:38:37.915 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] Started child 143780 _start_child /usr/lib/python3/dist-packages/oslo_service/service.py:575
2026-04-02 00:38:37.920 143780 INFO nova.service [-] Starting scheduler node (version 25.2.1)
2026-04-02 00:38:37.921 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] Started child 143781 _start_child /usr/lib/python3/dist-packages/oslo_service/service.py:575
2026-04-02 00:38:37.928 143781 INFO nova.service [-] Starting scheduler node (version 25.2.1)
2026-04-02 00:38:37.931 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] Started child 143787 _start_child /usr/lib/python3/dist-packages/oslo_service/service.py:575
2026-04-02 00:38:37.935 143779 DEBUG oslo_db.sqlalchemy.engines [req-7a7d92a7-5fc0-4b6a-b3e2-f546b194b011 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-02 00:38:37.935 143787 INFO nova.service [-] Starting scheduler node (version 25.2.1)
2026-04-02 00:38:37.936 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] Full set of CONF: wait /usr/lib/python3/dist-packages/oslo_service/service.py:649
2026-04-02 00:38:37.936 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2589
2026-04-02 00:38:37.937 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2590
2026-04-02 00:38:37.937 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] command line args: ['--config-file=/etc/nova/nova.conf', '--log-file=/var/log/nova/nova-scheduler.log'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2591
2026-04-02 00:38:37.937 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] config files: ['/etc/nova/nova.conf'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2592
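Excerpts like this one interleave records from the parent process and four workers, so eyeballing levels is error-prone. A small, hypothetical triage helper can count records per log level; the regex below assumes the oslo.log default prefix used throughout this file (`<date> <time>.<ms> <pid> <LEVEL> <logger> ...`), and note that each traceback line carries its own prefix, so a traceback counts once per line:

```python
import re
from collections import Counter

# Matches the oslo.log record prefix seen in this log, e.g.
# "2026-04-02 00:38:37.905 142933 INFO oslo_service.service [...] ..."
RECORD = re.compile(
    r"^(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3}) "
    r"(?P<pid>\d+) (?P<level>[A-Z]+) (?P<logger>\S+)"
)


def level_counts(lines):
    """Count log records per level; lines without the prefix are skipped."""
    counts = Counter()
    for line in lines:
        m = RECORD.match(line)
        if m:
            counts[m.group("level")] += 1
    return counts
```

Feeding a whole scheduler log through `level_counts(open(path))` gives a quick picture of how many CRITICAL/ERROR records each restart produced before digging into individual tracebacks.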
2026-04-02 00:38:37.938 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ================================================================================ log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2594
2026-04-02 00:38:37.938 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.938 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.939 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.934 143780 DEBUG oslo_db.sqlalchemy.engines [req-38d9c1a9-887e-42ec-a4df-e5367368ef31 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-02 00:38:37.939 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.939 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cert = self.pem log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.940 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.941 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.941 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] config_dir = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.941 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.942 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] config_file = ['/etc/nova/nova.conf'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.942 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] config_source = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.942 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] console_host = juju-6f200b-0-lxd-7 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.943 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] control_exchange = nova log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.943 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cpu_allocation_ratio = 2.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.943 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] daemon = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.943 143781 DEBUG oslo_db.sqlalchemy.engines [req-7f726e0b-7c47-4937-aa26-2cfcf0a13662 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-02 00:38:37.944 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] debug = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.944 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.944 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.944 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.945 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=DEBUG', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.945 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.946 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] disk_allocation_ratio = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.946 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] enable_new_services = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.947 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.947 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.947 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] fatal_deprecations = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.948 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] flat_injected = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.948 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] force_config_drive = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.948 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] force_raw_images = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.949 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.949 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.949 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] host = juju-6f200b-0-lxd-7 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.950 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] initial_cpu_allocation_ratio = 16.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.950 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] initial_disk_allocation_ratio = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.950 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] initial_ram_allocation_ratio = 1.5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.951 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] injected_network_template = /usr/lib/python3/dist-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.952 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.952 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.952 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.952 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.953 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.953 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.953 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.954 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.954 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.954 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] key = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.954 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.955 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] log_config_append = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.955 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.955 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] log_dir = /var/log/nova log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.956 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] log_file = /var/log/nova/nova-scheduler.log log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.956 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] log_options = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.956 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.956 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.957 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] log_rotation_type = none log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.957 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.958 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.958 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.958 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.958 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.959 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.959 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.959 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.960 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602
2026-04-02 00:38:37.963 143787 DEBUG oslo_db.sqlalchemy.engines [req-5c84845e-c5eb-4a7a-a293-4704a0f425c6 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314
2026-04-02 00:38:37.965 142933 DEBUG oslo_service.service
[req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.965 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] max_logfile_count = 30 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.966 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.966 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.966 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.967 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] metadata_listen_port = 8765 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.968 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] metadata_workers = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.968 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.968 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] mkisofs_cmd = genisoimage log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.969 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] my_block_storage_ip = 252.41.81.150 log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.969 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] my_ip = 252.41.81.150 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.969 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.971 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.972 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.972 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] osapi_compute_listen_port = 8764 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.972 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.972 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] osapi_compute_workers = 4 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.973 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] password_length = 12 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.973 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] periodic_enable = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 
00:38:37.973 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.973 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.974 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] preallocate_images = none log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.974 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] publish_errors = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.974 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] pybasedir = /usr/lib/python3/dist-packages log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.975 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ram_allocation_ratio = 0.98 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.975 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.976 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.976 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.977 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] reboot_timeout = 0 
log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.978 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.978 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] record = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.978 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] report_interval = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.978 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.979 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.979 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.979 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.979 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.980 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.980 142933 DEBUG oslo_service.service 
[req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.981 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.982 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.982 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.982 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.982 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.983 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.983 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.983 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.983 142933 DEBUG oslo_service.service 
[req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.983 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.984 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.984 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.984 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.984 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ssl_only = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.984 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.985 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.985 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.985 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] syslog_log_facility = LOG_USER log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.985 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] tempdir = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.986 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.987 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] transport_url = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.987 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.987 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] use_cow_images = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.987 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] use_eventlog = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.988 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] use_journal = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.989 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] use_json = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.989 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.989 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] use_stderr = 
False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.989 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] use_syslog = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.990 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.991 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.991 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.991 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.991 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.992 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] watch_log_file = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.992 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2602 2026-04-02 00:38:37.993 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.993 142933 DEBUG 
oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_concurrency.lock_path = /var/lock/nova log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.993 142933 WARNING oslo_config.cfg [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] Deprecated: Option "auth_strategy" from group "api" is deprecated for removal ( The only non-default choice, ``noauth2``, is for internal development and testing purposes only and should not be used in deployments. This option and its middleware, NoAuthMiddleware[V2_18], will be removed in a future release. ). Its value may be silently ignored in the future. 2026-04-02 00:38:37.994 143781 DEBUG nova.service [req-7f726e0b-7c47-4937-aa26-2cfcf0a13662 - - - - -] Creating RPC server for service scheduler start /usr/lib/python3/dist-packages/nova/service.py:182 2026-04-02 00:38:37.994 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.994 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.995 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.995 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api.dhcp_domain = novalocal log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.995 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api.enable_instance_password = True log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.996 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.996 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.996 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.996 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.996 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.997 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.997 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.997 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.997 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api.neutron_default_tenant_id = default log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.997 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.998 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.998 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.998 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.998 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.998 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.999 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.999 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.999 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.999 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cache.backend = dogpile.cache.null log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:37.999 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.000 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.000 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.000 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cache.enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.000 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.000 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.001 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.001 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 
2026-04-02 00:38:38.001 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.001 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.001 143779 DEBUG nova.service [req-7a7d92a7-5fc0-4b6a-b3e2-f546b194b011 - - - - -] Creating RPC server for service scheduler start /usr/lib/python3/dist-packages/nova/service.py:182 2026-04-02 00:38:38.001 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.002 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.002 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.002 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.002 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.002 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.003 142933 DEBUG oslo_service.service 
[req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.002 143780 DEBUG nova.service [req-38d9c1a9-887e-42ec-a4df-e5367368ef31 - - - - -] Creating RPC server for service scheduler start /usr/lib/python3/dist-packages/nova/service.py:182 2026-04-02 00:38:38.003 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.003 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.003 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cinder.auth_type = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.003 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.004 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cinder.catalog_info = volumev3::publicURL log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.004 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.004 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.004 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cinder.cross_az_attach = True log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.004 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cinder.debug = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.005 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.005 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.005 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.005 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.006 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cinder.os_region_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.006 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.006 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.006 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.006 142933 DEBUG 
oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.007 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.007 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.007 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.007 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.007 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.008 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] compute.packing_host_numa_cells_allocation_strategy = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.008 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.008 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.008 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.008 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.009 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] conductor.workers = 4 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.009 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.009 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.009 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.010 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.010 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.010 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.010 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.010 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.010 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.011 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.011 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.011 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.011 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.011 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.012 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.012 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.012 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.012 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.013 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.013 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.013 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.013 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.013 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] cyborg.version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.014 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.014 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] database.connection = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.014 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.014 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.014 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.015 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.015 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.015 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.015 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.016 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.017 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.017 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] database.max_pool_size = 4 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.018 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.018 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.018 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.018 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.018 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.019 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.020 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.020 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] database.use_db_reconnect = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.020 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.020 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.020 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.021 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.021 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.021 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.021 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.022 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.022 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.022 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.022 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.022 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api_database.max_pool_size = 4 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.022 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.023 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.023 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.023 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.023 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.024 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.024 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.024 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.024 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.024 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.025 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.025 142933 WARNING oslo_config.cfg [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] Deprecated: Option "api_servers" from group "glance" is deprecated for removal ( Support for image service configuration via standard keystoneauth1 Adapter options was added in the 17.0.0 Queens release. The api_servers option was retained temporarily to allow consumers time to cut over to a real load balancing solution. ). Its value may be silently ignored in the future.
2026-04-02 00:38:38.025 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.api_servers = ['http://252.41.108.180:9292'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.025 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.026 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.026 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.026 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.026 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.026 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.debug = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.027 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.027 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.027 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.027 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.027 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.028 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.028 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.max_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.028 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.min_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.028 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.028 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.028 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.029 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.029 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.029 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.region_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.029 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.service_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.029 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.service_type = image log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.030 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.030 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.030 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.030 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.030 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.030 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.031 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] glance.version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.031 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.030 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:38:38.031 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:38:38.031 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.031 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.031 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.032 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.032 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.032 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.032 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.032 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.033 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.033 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.033 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.033 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.033 143781 DEBUG nova.service [req-7f726e0b-7c47-4937-aa26-2cfcf0a13662 - - - - -] Join ServiceGroup membership for this service scheduler start /usr/lib/python3/dist-packages/nova/service.py:199
2026-04-02 00:38:38.033 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.033 143781 DEBUG nova.servicegroup.drivers.db [req-7f726e0b-7c47-4937-aa26-2cfcf0a13662 - - - - -] DB_Driver: join new ServiceGroup member juju-6f200b-0-lxd-7 to the scheduler group, service = join /usr/lib/python3/dist-packages/nova/servicegroup/drivers/db.py:44
2026-04-02 00:38:38.034 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.034 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.034 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.034 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.034 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.034 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] mks.enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.035 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.035 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.036 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.036 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.036 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.036 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.037 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.037 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.037 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.037 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.037 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.037 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:38:38.038 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.038 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:38:38.038 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.038 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.038 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.038 143780 DEBUG nova.service [req-38d9c1a9-887e-42ec-a4df-e5367368ef31 - - - - -] Join ServiceGroup membership for this service scheduler start /usr/lib/python3/dist-packages/nova/service.py:199
2026-04-02 00:38:38.038 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.039 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.039 143780 DEBUG nova.servicegroup.drivers.db [req-38d9c1a9-887e-42ec-a4df-e5367368ef31 - - - - -] DB_Driver: join new ServiceGroup member juju-6f200b-0-lxd-7 to the scheduler group, service = join /usr/lib/python3/dist-packages/nova/servicegroup/drivers/db.py:44
2026-04-02 00:38:38.039 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.039 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.039 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.039 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:38:38.039 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.040 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.040 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.040 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:38:38.040 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:38:38.040 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.040 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.040 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.041 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.041 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.041 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.041 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.041 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:38:38.041 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.041 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:38:38.041 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.042 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:38:38.042 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ironic.version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.042 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.042 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.042 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.043 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.043 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.043 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican.barbican_endpoint_type = public log_opt_values
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.043 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican.barbican_region_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.043 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.044 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.043 143787 DEBUG nova.service [req-5c84845e-c5eb-4a7a-a293-4704a0f425c6 - - - - -] Creating RPC server for service scheduler start /usr/lib/python3/dist-packages/nova/service.py:182 2026-04-02 00:38:38.044 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.044 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.044 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.044 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.044 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.045 142933 DEBUG oslo_service.service 
[req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.045 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.045 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.045 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.045 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.046 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.046 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.046 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:38:38.046 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:38:38.046 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican_service_user.cafile = None log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.046 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.047 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.047 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.047 143779 DEBUG nova.service [req-7a7d92a7-5fc0-4b6a-b3e2-f546b194b011 - - - - -] Join ServiceGroup membership for this service scheduler start /usr/lib/python3/dist-packages/nova/service.py:199 2026-04-02 00:38:38.047 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.047 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.047 143779 DEBUG nova.servicegroup.drivers.db [req-7a7d92a7-5fc0-4b6a-b3e2-f546b194b011 - - - - -] DB_Driver: join new ServiceGroup member juju-6f200b-0-lxd-7 to the scheduler group, service = join /usr/lib/python3/dist-packages/nova/servicegroup/drivers/db.py:44 2026-04-02 00:38:38.047 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] barbican_service_user.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.047 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] 
vault.approle_role_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.048 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.048 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vault.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.048 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vault.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.048 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.048 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vault.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.049 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.049 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.049 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.049 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vault.namespace = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.049 142933 DEBUG 
oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.049 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.050 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.050 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vault.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.050 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.050 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.050 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:38:38.050 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.050 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:38:38.051 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:38:38.051 142933 
DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.051 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.051 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.051 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.051 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.051 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.052 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.052 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.052 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.052 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] keystone.region_name = 
None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.052 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.053 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.053 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.053 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.053 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.053 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.053 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.054 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] keystone.version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.054 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 
2026-04-02 00:38:38.054 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.cpu_mode = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.054 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.055 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.055 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.055 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.055 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.055 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.055 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.056 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.056 142933 DEBUG oslo_service.service 
[req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.056 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.056 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.hw_machine_type = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.056 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.images_rbd_ceph_conf = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.057 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.057 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.057 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.images_rbd_glance_store_name = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.057 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.images_rbd_pool = rbd log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.057 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.images_type = default log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.058 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] 
libvirt.images_volume_group = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.058 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.058 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.058 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.058 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.059 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.059 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.059 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.059 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.059 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.060 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.060 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.060 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.live_migration_permit_auto_converge = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.060 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.live_migration_permit_post_copy = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.060 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.060 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.live_migration_timeout_action = abort log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.061 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.061 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.live_migration_uri = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.061 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.live_migration_with_native_tls = 
False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.061 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.061 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.063 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.063 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.063 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.063 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.064 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.064 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.064 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.num_pcie_ports = 0 log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.064 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.064 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.065 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.quobyte_client_cfg = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.065 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.066 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.066 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.066 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.067 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.rbd_secret_uuid = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.067 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.rbd_user = None log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.067 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.067 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.068 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.068 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.068 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.069 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.069 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.rx_queue_size = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.069 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.smbfs_mount_options = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.069 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:38:38.070 142933 DEBUG 
oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.070 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:38:38.070 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.070 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.070 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.071 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.072 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.swtpm_enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.072 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.swtpm_group = tss log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.072 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.071 143787 DEBUG nova.service [req-5c84845e-c5eb-4a7a-a293-4704a0f425c6 - - - - -] Join ServiceGroup membership for this service scheduler start /usr/lib/python3/dist-packages/nova/service.py:199
2026-04-02 00:38:38.072 143787 DEBUG nova.servicegroup.drivers.db [req-5c84845e-c5eb-4a7a-a293-4704a0f425c6 - - - - -] DB_Driver: join new ServiceGroup member juju-6f200b-0-lxd-7 to the scheduler group, service = join /usr/lib/python3/dist-packages/nova/servicegroup/drivers/db.py:44
2026-04-02 00:38:38.072 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.073 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.tx_queue_size = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.074 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.074 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.075 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.075 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.volume_clear = zero log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.075 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.075 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:38:38.075 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.volume_use_multipath = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.075 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:38:38.075 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:38:38.075 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.075 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.076 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.076 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.076 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.vzstorage_mount_perms = 0770 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.076 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.077 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.077 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.077 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.077 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.077 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.078 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.078 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.078 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.078 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.078 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.079 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.079 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.079 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.http_retries = 3 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.079 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.079 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.079 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.080 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.080 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.080 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.080 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.080 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.region_name = RegionOne log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.080 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.081 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.081 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.081 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.081 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.081 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.082 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.082 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.082 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] neutron.version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.082 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.083 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.083 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.083 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.083 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.083 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] pci.alias = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.084 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] pci.passthrough_whitelist = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.084 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.084 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.084 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.auth_url = http://252.41.11.116:35357 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.084 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.085 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.085 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.085 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.085 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.085 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.085 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.086 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.086 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.086 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.086 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.086 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.086 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.max_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.087 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.min_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.087 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.password = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.087 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.087 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.project_domain_name = service_domain log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.087 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.project_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.087 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.project_name = services log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.088 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.region_name = RegionOne log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.088 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.service_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.088 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.service_type = placement log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.088 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.088 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.089 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.089 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.089 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.089 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.089 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.user_domain_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.089 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.user_domain_name = service_domain log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.090 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.user_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.090 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.username = nova log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.090 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.090 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] placement.version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.090 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] powervm.disk_driver = localdisk log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.090 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] powervm.proc_units_factor = 0.1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.091 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] powervm.volume_group_name = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.091 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.091 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.091 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.091 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.091 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.092 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.092 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.092 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.092 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.092 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.092 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.093 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.093 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.093 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.093 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.094 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] scheduler.discover_hosts_in_cells_interval = 30 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.094 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.094 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.094 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.094 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.095 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.095 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.095 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.095 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.095 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.095 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] scheduler.workers = 4 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.095 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.096 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.096 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.096 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] filter_scheduler.build_failure_weight_multiplier = 0.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.096 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.096 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.097 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.097 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] filter_scheduler.enabled_filters = ['AvailabilityZoneFilter', 'ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'DifferentHostFilter', 'SameHostFilter'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.097 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.097 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.097 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.098 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.098 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.098 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.098 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.098 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.098 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.099 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.099 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.099 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.099 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.099 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.099 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.100 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] metrics.required = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.100 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.100 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.100 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.100 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] serial_console.base_url = wss://252.41.81.150:6083/ log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.101 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.101 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.101 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.101 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.105 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.105 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.105 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.106 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.106 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.106 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.106 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.106 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.106 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.106 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.107 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.107 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.107 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] spice.enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.107 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.107 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.108 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.108 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.108 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609
2026-04-02 00:38:38.108 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] upgrade_levels.baseapi = None log_opt_values
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.108 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.109 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.109 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.109 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.109 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.109 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.109 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.110 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.110 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.110 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.110 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.110 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.110 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.111 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.111 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.111 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.111 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.111 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 
00:38:38.111 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.111 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vmware.datastore_regex = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.112 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vmware.host_ip = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.112 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.112 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.112 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.112 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.112 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.113 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.113 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] 
vmware.pbm_default_policy = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.113 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.113 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.113 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.114 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.114 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.114 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.114 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.114 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.115 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vmware.vnc_port = 5900 log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.115 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.115 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.115 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.115 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.116 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.116 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.116 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vnc.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.116 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vnc.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.116 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 
00:38:38.116 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.117 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.117 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.117 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.117 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.117 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.117 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.118 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.118 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] workarounds.enable_numa_live_migration = False log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.118 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] workarounds.enable_qemu_monitor_announce_self = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.118 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.118 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.118 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.119 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.119 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] workarounds.reserve_disk_resource_for_image_cache = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.119 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.119 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] workarounds.skip_cpu_compare_on_dest = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.119 142933 DEBUG oslo_service.service 
[req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.119 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.120 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.120 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] wsgi.api_paste_config = /etc/nova/api-paste.ini log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.120 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.120 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.120 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.120 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.121 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.121 142933 DEBUG 
oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.121 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.121 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.121 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.121 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.122 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.122 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.122 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.122 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.122 142933 DEBUG oslo_service.service 
[req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.122 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.123 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.123 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.123 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.123 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.123 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_policy.enforce_scope = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.124 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.124 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.124 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.124 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.124 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.125 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.125 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.125 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.125 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.125 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.125 142933 DEBUG oslo_service.service 
[req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] profiler.connection_string = messaging:// log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.126 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] profiler.enabled = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.126 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] profiler.es_doc_type = notification log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.126 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] profiler.es_scroll_size = 10000 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.126 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] profiler.es_scroll_time = 2m log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.126 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] profiler.filter_error_trace = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.126 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] profiler.hmac_keys = SECRET_KEY log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.127 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.127 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] profiler.socket_timeout = 0.1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.127 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] 
profiler.trace_sqlalchemy = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.127 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.127 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.128 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.128 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.128 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.128 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.128 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.128 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.128 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - 
- -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.129 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.129 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.129 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.129 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.129 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.130 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.130 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.130 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 
00:38:38.130 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.130 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.130 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.130 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.131 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.131 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.131 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.131 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.131 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.131 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.132 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.132 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.132 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.132 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.132 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.133 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.133 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.133 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_limit.auth_type = None log_opt_values 
/usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.133 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.133 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_limit.certfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.133 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.134 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.134 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.134 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.134 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.134 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.134 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.134 142933 DEBUG 
oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.135 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.135 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.135 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.135 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_limit.service_type = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.135 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.135 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.136 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.136 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.136 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] 
oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.136 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.136 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.136 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.136 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2609 2026-04-02 00:38:38.137 142933 DEBUG oslo_service.service [req-ca17f5dc-e811-405e-b4ef-feda61ffbe85 - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3/dist-packages/oslo_config/cfg.py:2613 2026-04-02 00:38:39.041 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:38:39.042 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:38:39.042 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:38:39.043 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:38:39.043 143780 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:38:39.043 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:38:39.052 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:38:39.053 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:38:39.053 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:38:39.076 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:38:39.077 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:38:39.077 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:38:41.044 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:38:41.045 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:38:41.045 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:38:41.045 143780 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:38:41.046 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:38:41.046 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:38:41.054 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:38:41.054 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:38:41.054 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:38:41.079 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:38:41.079 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:38:41.079 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:38:45.047 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:38:45.048 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:38:45.048 143780 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:38:45.048 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:38:45.048 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:38:45.048 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:38:45.057 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:38:45.058 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:38:45.058 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:38:45.082 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:38:45.082 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:38:45.082 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:38:47.039 143781 DEBUG oslo_service.periodic_task [req-db894324-53be-4a50-bd74-3bd4b72b4e46 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks 
/usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:38:47.041 143781 DEBUG oslo_db.sqlalchemy.engines [req-db894324-53be-4a50-bd74-3bd4b72b4e46 - - - - -] Parent process 142933 forked (143781) with an open database connection, which is being discarded and recreated. checkout /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:413 2026-04-02 00:38:47.050 143781 DEBUG oslo_concurrency.lockutils [req-db894324-53be-4a50-bd74-3bd4b72b4e46 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:38:47.052 143781 DEBUG oslo_concurrency.lockutils [req-db894324-53be-4a50-bd74-3bd4b72b4e46 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:38:47.058 143781 DEBUG oslo_db.sqlalchemy.engines [req-db894324-53be-4a50-bd74-3bd4b72b4e46 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-02 00:38:53.049 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:38:53.050 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:38:53.050 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:38:53.051 143781 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:38:53.051 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:38:53.051 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:38:53.059 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:38:53.059 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:38:53.060 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:38:53.083 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:38:53.084 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:38:53.084 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:39:17.076 143781 DEBUG oslo_service.periodic_task [req-db894324-53be-4a50-bd74-3bd4b72b4e46 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:39:17.081 143781 DEBUG oslo_concurrency.lockutils [req-f65eb130-81dc-45d9-9a39-417d8bd3e6de - - - - -] Lock 
"93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:39:17.081 143781 DEBUG oslo_concurrency.lockutils [req-f65eb130-81dc-45d9-9a39-417d8bd3e6de - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:39:17.128 143781 INFO nova.scheduler.manager [req-f65eb130-81dc-45d9-9a39-417d8bd3e6de - - - - -] Discovered 1 new hosts: cell1:cn-jenkins-deploy-platform-juju-os-690-1 2026-04-02 00:39:20.055 143779 DEBUG oslo_service.periodic_task [req-cf66eb98-9e3b-4003-9f64-2a148ed79f86 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:39:20.057 143779 DEBUG oslo_db.sqlalchemy.engines [req-cf66eb98-9e3b-4003-9f64-2a148ed79f86 - - - - -] Parent process 142933 forked (143779) with an open database connection, which is being discarded and recreated. 
checkout /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:413 2026-04-02 00:39:20.064 143779 DEBUG oslo_concurrency.lockutils [req-cf66eb98-9e3b-4003-9f64-2a148ed79f86 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:39:20.066 143779 DEBUG oslo_concurrency.lockutils [req-cf66eb98-9e3b-4003-9f64-2a148ed79f86 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:39:20.071 143779 DEBUG oslo_db.sqlalchemy.engines [req-cf66eb98-9e3b-4003-9f64-2a148ed79f86 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-02 00:39:23.021 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:39:23.022 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:39:23.022 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:39:23.028 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:39:23.029 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:39:23.029 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:39:23.034 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:39:23.034 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:39:23.034 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:39:23.072 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:39:23.073 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:39:23.073 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:39:26.042 143780 DEBUG oslo_service.periodic_task [req-6176f254-9b50-4996-a107-f5f785990ce7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:39:26.043 143780 DEBUG oslo_db.sqlalchemy.engines [req-6176f254-9b50-4996-a107-f5f785990ce7 - - - - -] Parent process 142933 forked (143780) with an open database connection, which is being discarded and recreated. 
checkout /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:413 2026-04-02 00:39:26.049 143780 DEBUG oslo_concurrency.lockutils [req-6176f254-9b50-4996-a107-f5f785990ce7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:39:26.051 143780 DEBUG oslo_concurrency.lockutils [req-6176f254-9b50-4996-a107-f5f785990ce7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:39:26.056 143780 DEBUG oslo_db.sqlalchemy.engines [req-6176f254-9b50-4996-a107-f5f785990ce7 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-02 00:39:30.081 143787 DEBUG oslo_service.periodic_task [req-c54a3452-1dca-4f8b-b0c4-3a53c9a91d15 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:39:30.084 143787 DEBUG oslo_db.sqlalchemy.engines [req-c54a3452-1dca-4f8b-b0c4-3a53c9a91d15 - - - - -] Parent process 142933 forked (143787) with an open database connection, which is being discarded and recreated. 
checkout /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:413 2026-04-02 00:39:30.091 143787 DEBUG oslo_concurrency.lockutils [req-c54a3452-1dca-4f8b-b0c4-3a53c9a91d15 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:39:30.093 143787 DEBUG oslo_concurrency.lockutils [req-c54a3452-1dca-4f8b-b0c4-3a53c9a91d15 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:39:30.099 143787 DEBUG oslo_db.sqlalchemy.engines [req-c54a3452-1dca-4f8b-b0c4-3a53c9a91d15 - - - - -] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python3/dist-packages/oslo_db/sqlalchemy/engines.py:314 2026-04-02 00:39:47.133 143781 DEBUG oslo_service.periodic_task [req-f65eb130-81dc-45d9-9a39-417d8bd3e6de - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:39:47.137 143781 DEBUG oslo_concurrency.lockutils [req-6efbbef2-1909-41c1-b926-7f2c5d3dc927 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:39:47.138 143781 DEBUG oslo_concurrency.lockutils [req-6efbbef2-1909-41c1-b926-7f2c5d3dc927 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:39:50.089 143779 DEBUG oslo_service.periodic_task [req-cf66eb98-9e3b-4003-9f64-2a148ed79f86 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:39:50.094 143779 DEBUG oslo_concurrency.lockutils [req-f823313c-d24e-4e7d-9486-3e304ba6a0a0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:39:50.095 143779 DEBUG oslo_concurrency.lockutils [req-f823313c-d24e-4e7d-9486-3e304ba6a0a0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:39:55.025 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:39:55.026 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:39:55.026 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:39:55.031 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:39:55.032 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:39:55.032 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:39:55.037 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:39:55.037 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:39:55.037 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:39:55.075 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:39:55.075 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:39:55.075 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:39:56.069 143780 DEBUG oslo_service.periodic_task [req-6176f254-9b50-4996-a107-f5f785990ce7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:39:56.073 143780 DEBUG oslo_concurrency.lockutils [req-162ac6c8-9bf9-4f59-907a-125e4e9c4729 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:39:56.074 143780 DEBUG oslo_concurrency.lockutils [req-162ac6c8-9bf9-4f59-907a-125e4e9c4729 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:40:00.115 143787 DEBUG oslo_service.periodic_task [req-c54a3452-1dca-4f8b-b0c4-3a53c9a91d15 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:40:00.120 143787 DEBUG oslo_concurrency.lockutils [req-4db9fac1-594d-4dfa-b617-f81a3b566a2c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:40:00.120 143787 DEBUG oslo_concurrency.lockutils [req-4db9fac1-594d-4dfa-b617-f81a3b566a2c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:40:17.146 143781 DEBUG oslo_service.periodic_task [req-6efbbef2-1909-41c1-b926-7f2c5d3dc927 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:40:17.151 143781 DEBUG oslo_concurrency.lockutils [req-54d6396f-752d-4ac6-855e-4c21241b2adb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:40:17.151 143781 DEBUG oslo_concurrency.lockutils [req-54d6396f-752d-4ac6-855e-4c21241b2adb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:40:20.106 143779 DEBUG oslo_service.periodic_task [req-f823313c-d24e-4e7d-9486-3e304ba6a0a0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:40:20.111 143779 DEBUG oslo_concurrency.lockutils [req-1c0419c5-cd7f-4833-a04f-980c8d069a40 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:40:20.111 143779 DEBUG oslo_concurrency.lockutils [req-1c0419c5-cd7f-4833-a04f-980c8d069a40 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:40:26.081 143780 DEBUG oslo_service.periodic_task [req-162ac6c8-9bf9-4f59-907a-125e4e9c4729 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:40:26.085 143780 DEBUG oslo_concurrency.lockutils [req-581de76a-7bcb-4fd0-86dc-849e58e58dac - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:40:26.085 143780 DEBUG oslo_concurrency.lockutils [req-581de76a-7bcb-4fd0-86dc-849e58e58dac - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:40:30.132 143787 DEBUG 
oslo_service.periodic_task [req-4db9fac1-594d-4dfa-b617-f81a3b566a2c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:40:30.136 143787 DEBUG oslo_concurrency.lockutils [req-54a2d541-bdfe-404a-8060-699b8c4693aa - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:40:30.136 143787 DEBUG oslo_concurrency.lockutils [req-54a2d541-bdfe-404a-8060-699b8c4693aa - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:40:47.158 143781 DEBUG oslo_service.periodic_task [req-54d6396f-752d-4ac6-855e-4c21241b2adb - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:40:47.163 143781 DEBUG oslo_concurrency.lockutils [req-6f441c48-ee4b-42d4-b008-5625fc7f58fd - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:40:47.163 143781 DEBUG oslo_concurrency.lockutils [req-6f441c48-ee4b-42d4-b008-5625fc7f58fd - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:40:51.057 143779 DEBUG oslo_service.periodic_task [req-1c0419c5-cd7f-4833-a04f-980c8d069a40 - - - - -] Running periodic task 
SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:40:51.061 143779 DEBUG oslo_concurrency.lockutils [req-6dc1f2b3-d99d-4c31-8b1e-fd26402d8538 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:40:51.061 143779 DEBUG oslo_concurrency.lockutils [req-6dc1f2b3-d99d-4c31-8b1e-fd26402d8538 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:40:56.092 143780 DEBUG oslo_service.periodic_task [req-581de76a-7bcb-4fd0-86dc-849e58e58dac - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:40:56.097 143780 DEBUG oslo_concurrency.lockutils [req-68cbd25c-21f6-47a2-8fd7-5404ec59671e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:40:56.097 143780 DEBUG oslo_concurrency.lockutils [req-68cbd25c-21f6-47a2-8fd7-5404ec59671e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:40:59.029 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:40:59.030 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] 
Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:40:59.030 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:40:59.033 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:40:59.034 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:40:59.034 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:40:59.043 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:40:59.044 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:40:59.044 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:40:59.082 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:40:59.082 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:40:59.082 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:41:00.140 143787 DEBUG oslo_service.periodic_task 
[req-54a2d541-bdfe-404a-8060-699b8c4693aa - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:41:00.144 143787 DEBUG oslo_concurrency.lockutils [req-787a8ac7-fe4f-458d-8287-4363b77f98a7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:41:00.144 143787 DEBUG oslo_concurrency.lockutils [req-787a8ac7-fe4f-458d-8287-4363b77f98a7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:41:03.967 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 026e537c9bf241b0b42fa6567d673ef0 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 00:41:03.967 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 026e537c9bf241b0b42fa6567d673ef0 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 00:41:03.967 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 026e537c9bf241b0b42fa6567d673ef0 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 00:41:03.967 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 026e537c9bf241b0b42fa6567d673ef0 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 00:41:03.968 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:03.968 143781 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:03.968 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:03.968 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:03.968 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 026e537c9bf241b0b42fa6567d673ef0 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 00:41:03.968 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 026e537c9bf241b0b42fa6567d673ef0 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 00:41:03.968 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 026e537c9bf241b0b42fa6567d673ef0 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 00:41:03.968 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 026e537c9bf241b0b42fa6567d673ef0 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 00:41:03.968 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:03.968 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:03.968 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:03.968 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] 
Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:03.968 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:41:03.968 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:41:03.968 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:41:03.968 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:41:03.969 143779 DEBUG oslo_concurrency.lockutils [req-1d517256-be5f-4e90-a678-98a9de84c230 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:41:03.969 143780 DEBUG oslo_concurrency.lockutils [req-1d517256-be5f-4e90-a678-98a9de84c230 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:41:03.969 143787 DEBUG oslo_concurrency.lockutils [req-1d517256-be5f-4e90-a678-98a9de84c230 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:41:03.969 143781 DEBUG oslo_concurrency.lockutils [req-1d517256-be5f-4e90-a678-98a9de84c230 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:41:03.970 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:41:03.970 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:41:03.970 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:03.970 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:03.970 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:41:03.971 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:41:03.971 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:41:03.971 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:03.971 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:41:03.974 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:41:03.974 143781 DEBUG oslo_concurrency.lockutils [req-1d517256-be5f-4e90-a678-98a9de84c230 - - 
- - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:41:03.974 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:03.974 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:41:03.974 143781 DEBUG oslo_concurrency.lockutils [req-1d517256-be5f-4e90-a678-98a9de84c230 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:41:03.977 143787 DEBUG oslo_concurrency.lockutils [req-1d517256-be5f-4e90-a678-98a9de84c230 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:41:03.977 143780 DEBUG oslo_concurrency.lockutils [req-1d517256-be5f-4e90-a678-98a9de84c230 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:41:03.977 143779 DEBUG oslo_concurrency.lockutils [req-1d517256-be5f-4e90-a678-98a9de84c230 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:41:03.977 143787 DEBUG oslo_concurrency.lockutils 
[req-1d517256-be5f-4e90-a678-98a9de84c230 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:41:03.977 143779 DEBUG oslo_concurrency.lockutils [req-1d517256-be5f-4e90-a678-98a9de84c230 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:41:03.977 143780 DEBUG oslo_concurrency.lockutils [req-1d517256-be5f-4e90-a678-98a9de84c230 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:41:03.978 143781 INFO nova.scheduler.host_manager [req-1d517256-be5f-4e90-a678-98a9de84c230 - - - - -] Received a sync request from an unknown host 'cn-jenkins-deploy-platform-juju-os-690-1'. Re-created its InstanceList. 2026-04-02 00:41:03.978 143781 DEBUG oslo_concurrency.lockutils [req-1d517256-be5f-4e90-a678-98a9de84c230 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.009s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:41:03.980 143780 INFO nova.scheduler.host_manager [req-1d517256-be5f-4e90-a678-98a9de84c230 - - - - -] Received a sync request from an unknown host 'cn-jenkins-deploy-platform-juju-os-690-1'. Re-created its InstanceList. 2026-04-02 00:41:03.980 143787 INFO nova.scheduler.host_manager [req-1d517256-be5f-4e90-a678-98a9de84c230 - - - - -] Received a sync request from an unknown host 'cn-jenkins-deploy-platform-juju-os-690-1'. Re-created its InstanceList. 
2026-04-02 00:41:03.981 143780 DEBUG oslo_concurrency.lockutils [req-1d517256-be5f-4e90-a678-98a9de84c230 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.012s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:41:03.981 143787 DEBUG oslo_concurrency.lockutils [req-1d517256-be5f-4e90-a678-98a9de84c230 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.012s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:41:03.981 143779 INFO nova.scheduler.host_manager [req-1d517256-be5f-4e90-a678-98a9de84c230 - - - - -] Received a sync request from an unknown host 'cn-jenkins-deploy-platform-juju-os-690-1'. Re-created its InstanceList. 2026-04-02 00:41:03.981 143779 DEBUG oslo_concurrency.lockutils [req-1d517256-be5f-4e90-a678-98a9de84c230 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.013s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:41:04.971 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:41:04.971 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:41:04.972 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:04.972 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:04.972 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:41:04.972 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:41:04.973 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:41:04.973 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:04.973 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:41:04.976 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:41:04.976 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:04.977 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:41:06.974 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:41:06.974 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:06.974 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:41:06.974 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:41:06.974 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:06.974 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:41:06.975 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:41:06.975 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:06.975 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:41:06.979 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:41:06.979 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:06.979 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:41:10.977 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:41:10.978 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:10.978 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:41:10.978 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:41:10.978 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:10.979 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:41:10.979 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:41:10.979 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:10.979 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:41:10.983 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:41:10.983 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:10.983 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:41:18.041 143781 DEBUG oslo_service.periodic_task [req-6f441c48-ee4b-42d4-b008-5625fc7f58fd - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 
00:41:18.046 143781 DEBUG oslo_concurrency.lockutils [req-29eb83f4-8d5f-4891-a83e-55572357da09 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:41:18.046 143781 DEBUG oslo_concurrency.lockutils [req-29eb83f4-8d5f-4891-a83e-55572357da09 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:41:18.980 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:41:18.981 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:18.981 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:41:18.982 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:41:18.982 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:41:18.982 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:41:18.983 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:41:18.983 143780 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:41:18.984 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:41:18.989 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:41:18.990 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:41:18.990 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:41:22.057 143779 DEBUG oslo_service.periodic_task [req-6dc1f2b3-d99d-4c31-8b1e-fd26402d8538 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:41:22.062 143779 DEBUG oslo_concurrency.lockutils [req-c656fdfc-1ee1-4488-9424-15476118b1b5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:41:22.063 143779 DEBUG oslo_concurrency.lockutils [req-c656fdfc-1ee1-4488-9424-15476118b1b5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:41:27.045 143780 DEBUG oslo_service.periodic_task [req-68cbd25c-21f6-47a2-8fd7-5404ec59671e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:41:27.049 143780 DEBUG oslo_concurrency.lockutils [req-a0272eb7-a168-4144-9a27-c3a6020a39e3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:41:27.049 143780 DEBUG oslo_concurrency.lockutils [req-a0272eb7-a168-4144-9a27-c3a6020a39e3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:41:31.083 143787 DEBUG oslo_service.periodic_task [req-787a8ac7-fe4f-458d-8287-4363b77f98a7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:41:31.088 143787 DEBUG oslo_concurrency.lockutils [req-9470fe8f-7c7f-4f8f-892d-773f3da8f810 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:41:31.088 143787 DEBUG oslo_concurrency.lockutils [req-9470fe8f-7c7f-4f8f-892d-773f3da8f810 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:41:34.982 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:41:34.983 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:41:34.983 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:41:34.984 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:41:34.984 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:41:34.984 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:41:34.985 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:41:34.985 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:41:34.986 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:41:34.992 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:41:34.992 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:41:34.992 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:41:48.055 143781 DEBUG oslo_service.periodic_task [req-29eb83f4-8d5f-4891-a83e-55572357da09 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:41:48.060 143781 DEBUG oslo_concurrency.lockutils [req-a86dde6f-9255-40b9-a546-5342b824ec78 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:41:48.060 143781 DEBUG oslo_concurrency.lockutils [req-a86dde6f-9255-40b9-a546-5342b824ec78 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:41:53.057 143779 DEBUG oslo_service.periodic_task [req-c656fdfc-1ee1-4488-9424-15476118b1b5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:41:53.061 143779 DEBUG oslo_concurrency.lockutils [req-64931e58-91b4-4c92-9319-f0454c1daa0f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:41:53.061 143779 DEBUG oslo_concurrency.lockutils [req-64931e58-91b4-4c92-9319-f0454c1daa0f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:41:57.057 143780 DEBUG oslo_service.periodic_task [req-a0272eb7-a168-4144-9a27-c3a6020a39e3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:41:57.061 143780 DEBUG oslo_concurrency.lockutils [req-6a92307d-7256-420d-8762-4592052c6f3e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:41:57.061 143780 DEBUG oslo_concurrency.lockutils [req-6a92307d-7256-420d-8762-4592052c6f3e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:42:01.100 143787 DEBUG oslo_service.periodic_task [req-9470fe8f-7c7f-4f8f-892d-773f3da8f810 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:42:01.104 143787 DEBUG oslo_concurrency.lockutils [req-0b354bca-b4f2-49c8-a51c-35c23950ffdf - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:42:01.104 143787 DEBUG oslo_concurrency.lockutils [req-0b354bca-b4f2-49c8-a51c-35c23950ffdf - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:42:06.987 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:42:06.987 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:42:06.987 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:42:06.989 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:42:06.989 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:42:06.990 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:42:06.990 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:42:06.990 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:42:06.990 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:42:06.996 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:42:06.997 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:42:06.997 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:42:19.042 143781 DEBUG oslo_service.periodic_task [req-a86dde6f-9255-40b9-a546-5342b824ec78 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:42:19.047 143781 DEBUG oslo_concurrency.lockutils [req-35e76575-5b6a-43a2-b946-58ff22c01e2e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:42:19.048 143781 DEBUG oslo_concurrency.lockutils [req-35e76575-5b6a-43a2-b946-58ff22c01e2e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:42:23.066 143779 DEBUG oslo_service.periodic_task [req-64931e58-91b4-4c92-9319-f0454c1daa0f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:42:23.070 143779 DEBUG oslo_concurrency.lockutils [req-24faef25-d0e8-4282-b569-547d39f204bc - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:42:23.071 143779 DEBUG oslo_concurrency.lockutils [req-24faef25-d0e8-4282-b569-547d39f204bc - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:42:27.070 143780 DEBUG oslo_service.periodic_task [req-6a92307d-7256-420d-8762-4592052c6f3e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:42:27.074 143780 DEBUG oslo_concurrency.lockutils [req-76422a9f-6973-4a6a-9b19-49333dd80196 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:42:27.075 143780 DEBUG oslo_concurrency.lockutils [req-76422a9f-6973-4a6a-9b19-49333dd80196 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:42:32.083 143787 DEBUG oslo_service.periodic_task [req-0b354bca-b4f2-49c8-a51c-35c23950ffdf - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:42:32.087 143787 DEBUG oslo_concurrency.lockutils [req-b2cf42d5-5dec-4411-a2e2-08bd5cf84350 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:42:32.087 143787 DEBUG oslo_concurrency.lockutils [req-b2cf42d5-5dec-4411-a2e2-08bd5cf84350 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:42:49.058 143781 DEBUG oslo_service.periodic_task [req-35e76575-5b6a-43a2-b946-58ff22c01e2e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:42:49.062 143781 DEBUG oslo_concurrency.lockutils [req-401ae088-16b9-4007-92b3-1e77fe3306c7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:42:49.062 143781 DEBUG oslo_concurrency.lockutils [req-401ae088-16b9-4007-92b3-1e77fe3306c7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:42:53.075 143779 DEBUG oslo_service.periodic_task [req-24faef25-d0e8-4282-b569-547d39f204bc - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:42:53.079 143779 DEBUG oslo_concurrency.lockutils [req-d388cc92-9692-4070-b1bc-8760fe1b29ba - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:42:53.079 143779 DEBUG oslo_concurrency.lockutils [req-d388cc92-9692-4070-b1bc-8760fe1b29ba - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:42:57.081 143780 DEBUG oslo_service.periodic_task [req-76422a9f-6973-4a6a-9b19-49333dd80196 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:42:57.085 143780 DEBUG oslo_concurrency.lockutils [req-3764d61b-f839-42ba-a5e0-c72593c81dcb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:42:57.086 143780 DEBUG oslo_concurrency.lockutils [req-3764d61b-f839-42ba-a5e0-c72593c81dcb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:43:02.099 143787 DEBUG oslo_service.periodic_task [req-b2cf42d5-5dec-4411-a2e2-08bd5cf84350 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:43:02.103 143787 DEBUG oslo_concurrency.lockutils [req-7e554c65-d15d-4d9a-9f84-2caba6812d02 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:43:02.103 143787 DEBUG oslo_concurrency.lockutils [req-7e554c65-d15d-4d9a-9f84-2caba6812d02 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:43:04.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 9b56b95552bd4596b4d66e5bfa81d8b1 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 00:43:04.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 9b56b95552bd4596b4d66e5bfa81d8b1 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 00:43:04.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 9b56b95552bd4596b4d66e5bfa81d8b1 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 00:43:04.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:04.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9b56b95552bd4596b4d66e5bfa81d8b1 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 00:43:04.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:04.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 9b56b95552bd4596b4d66e5bfa81d8b1 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 00:43:04.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9b56b95552bd4596b4d66e5bfa81d8b1 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 00:43:04.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:04.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:04.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:43:04.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9b56b95552bd4596b4d66e5bfa81d8b1 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 00:43:04.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:04.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:43:04.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:04.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:04.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9b56b95552bd4596b4d66e5bfa81d8b1 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 00:43:04.821 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:43:04.821 143781 DEBUG oslo_concurrency.lockutils [req-2ead7fea-172d-4b56-8e76-5130158348a2 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:43:04.821 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:04.821 143779 DEBUG oslo_concurrency.lockutils [req-2ead7fea-172d-4b56-8e76-5130158348a2 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:43:04.821 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:43:04.821 143781 DEBUG nova.scheduler.host_manager [req-2ead7fea-172d-4b56-8e76-5130158348a2 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 00:43:04.821 143780 DEBUG oslo_concurrency.lockutils [req-2ead7fea-172d-4b56-8e76-5130158348a2 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:43:04.821 143781 DEBUG oslo_concurrency.lockutils [req-2ead7fea-172d-4b56-8e76-5130158348a2 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:43:04.821 143779 DEBUG nova.scheduler.host_manager [req-2ead7fea-172d-4b56-8e76-5130158348a2 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 00:43:04.821 143780 DEBUG nova.scheduler.host_manager [req-2ead7fea-172d-4b56-8e76-5130158348a2 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 00:43:04.821 143779 DEBUG oslo_concurrency.lockutils [req-2ead7fea-172d-4b56-8e76-5130158348a2 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:43:04.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:43:04.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:04.822 143780 DEBUG oslo_concurrency.lockutils [req-2ead7fea-172d-4b56-8e76-5130158348a2 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:43:04.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:43:04.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:43:04.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:04.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:43:04.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:43:04.822 143787 DEBUG oslo_concurrency.lockutils [req-2ead7fea-172d-4b56-8e76-5130158348a2 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:43:04.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:04.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:43:04.822 143787 DEBUG nova.scheduler.host_manager [req-2ead7fea-172d-4b56-8e76-5130158348a2 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 00:43:04.823 143787 DEBUG oslo_concurrency.lockutils [req-2ead7fea-172d-4b56-8e76-5130158348a2 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:43:04.823 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:43:04.823 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:04.823 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:43:05.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:43:05.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:05.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:43:05.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:43:05.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:05.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:43:05.823 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:43:05.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:05.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:43:05.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:43:05.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:05.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:43:07.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:43:07.825 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:43:07.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:43:07.826 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:07.826 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:07.826 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:43:07.826 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:07.826 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:43:07.826 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:43:07.827 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:43:07.828 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:07.828 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:43:11.830 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:43:11.830 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:11.830 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:43:11.830 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:43:11.830 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:11.830 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:43:11.831 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:43:11.831 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:11.831 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:43:11.831 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:43:11.831 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:11.831 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:43:19.833 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:43:19.833 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:19.833 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:43:19.837 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:43:19.837 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:43:19.837 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:43:19.838 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:19.838 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:19.838 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:43:19.838 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:43:19.838 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:43:19.838 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:43:20.041 143781 DEBUG oslo_service.periodic_task [req-401ae088-16b9-4007-92b3-1e77fe3306c7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:43:20.045 143781 DEBUG oslo_concurrency.lockutils [req-a260a8ab-ffc2-4b87-b2c0-2c819e68fa63 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:43:20.045 143781 DEBUG oslo_concurrency.lockutils [req-a260a8ab-ffc2-4b87-b2c0-2c819e68fa63 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:43:24.057 143779 DEBUG oslo_service.periodic_task [req-d388cc92-9692-4070-b1bc-8760fe1b29ba - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:43:24.061 143779 DEBUG oslo_concurrency.lockutils [req-83f22314-9463-487c-b554-91a807000c0d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:43:24.062 143779 DEBUG oslo_concurrency.lockutils [req-83f22314-9463-487c-b554-91a807000c0d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:43:28.045 143780 DEBUG oslo_service.periodic_task [req-3764d61b-f839-42ba-a5e0-c72593c81dcb - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:43:28.049 143780 DEBUG oslo_concurrency.lockutils [req-e728f468-a5e6-47a3-b8b5-c2973417e184 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:43:28.050 143780 DEBUG oslo_concurrency.lockutils [req-e728f468-a5e6-47a3-b8b5-c2973417e184 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:43:32.116 143787 DEBUG oslo_service.periodic_task [req-7e554c65-d15d-4d9a-9f84-2caba6812d02 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:43:32.120 143787 DEBUG oslo_concurrency.lockutils [req-f653e18c-e8a5-45f1-aa75-8086774cf334 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:43:32.120 143787 DEBUG
oslo_concurrency.lockutils [req-f653e18c-e8a5-45f1-aa75-8086774cf334 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:43:35.835 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:43:35.835 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:43:35.836 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:43:35.839 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:43:35.839 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:43:35.839 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:43:35.840 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:43:35.840 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:43:35.840 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:43:35.840 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:43:35.840 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:43:35.840 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:43:50.053 143781 DEBUG oslo_service.periodic_task [req-a260a8ab-ffc2-4b87-b2c0-2c819e68fa63 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:43:50.057 143781 DEBUG oslo_concurrency.lockutils [req-6c420db5-4e6f-4044-834d-c70820910179 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:43:50.058 143781 DEBUG oslo_concurrency.lockutils [req-6c420db5-4e6f-4044-834d-c70820910179 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:43:55.058 143779 DEBUG oslo_service.periodic_task [req-83f22314-9463-487c-b554-91a807000c0d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:43:55.062 143779 DEBUG oslo_concurrency.lockutils [req-b4debcae-a2dd-46ba-9fd6-60df8a5b7e18 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:43:55.062 143779 DEBUG oslo_concurrency.lockutils [req-b4debcae-a2dd-46ba-9fd6-60df8a5b7e18 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:43:58.058 143780 DEBUG oslo_service.periodic_task [req-e728f468-a5e6-47a3-b8b5-c2973417e184 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:43:58.062 143780 DEBUG oslo_concurrency.lockutils [req-6648afae-52d5-41bb-aa40-992595a24c19 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:43:58.062 143780 DEBUG oslo_concurrency.lockutils [req-6648afae-52d5-41bb-aa40-992595a24c19 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:44:02.133 143787 DEBUG oslo_service.periodic_task [req-f653e18c-e8a5-45f1-aa75-8086774cf334 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:44:02.137 143787 DEBUG oslo_concurrency.lockutils [req-a9562ec1-224b-4529-9dc9-765129234d88 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:44:02.137 143787 DEBUG 
oslo_concurrency.lockutils [req-a9562ec1-224b-4529-9dc9-765129234d88 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:44:07.840 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:44:07.840 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:44:07.840 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:44:07.843 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:44:07.843 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:44:07.843 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:44:07.845 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:44:07.845 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:44:07.845 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:44:07.846 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:44:07.846 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:44:07.846 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:44:20.069 143781 DEBUG oslo_service.periodic_task [req-6c420db5-4e6f-4044-834d-c70820910179 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:44:20.073 143781 DEBUG oslo_concurrency.lockutils [req-2d0271b7-4c96-453a-b050-334b311a3e1b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:44:20.074 143781 DEBUG oslo_concurrency.lockutils [req-2d0271b7-4c96-453a-b050-334b311a3e1b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:44:25.068 143779 DEBUG oslo_service.periodic_task [req-b4debcae-a2dd-46ba-9fd6-60df8a5b7e18 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:44:25.072 143779 DEBUG oslo_concurrency.lockutils [req-fe956dfa-0eb7-483c-80fe-9521e460e8e1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:44:25.073 143779 DEBUG oslo_concurrency.lockutils [req-fe956dfa-0eb7-483c-80fe-9521e460e8e1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:44:28.069 143780 DEBUG oslo_service.periodic_task [req-6648afae-52d5-41bb-aa40-992595a24c19 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:44:28.073 143780 DEBUG oslo_concurrency.lockutils [req-793239dd-72b2-469c-9a43-91951f7dd028 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:44:28.073 143780 DEBUG oslo_concurrency.lockutils [req-793239dd-72b2-469c-9a43-91951f7dd028 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:44:32.146 143787 DEBUG oslo_service.periodic_task [req-a9562ec1-224b-4529-9dc9-765129234d88 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:44:32.149 143787 DEBUG oslo_concurrency.lockutils [req-58358509-3b66-457e-a2ea-45d258187d84 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:44:32.150 143787 DEBUG 
oslo_concurrency.lockutils [req-58358509-3b66-457e-a2ea-45d258187d84 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:44:50.081 143781 DEBUG oslo_service.periodic_task [req-2d0271b7-4c96-453a-b050-334b311a3e1b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:44:50.085 143781 DEBUG oslo_concurrency.lockutils [req-ead4427e-e0a7-4422-bd0e-874bc098ced0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:44:50.086 143781 DEBUG oslo_concurrency.lockutils [req-ead4427e-e0a7-4422-bd0e-874bc098ced0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:44:55.078 143779 DEBUG oslo_service.periodic_task [req-fe956dfa-0eb7-483c-80fe-9521e460e8e1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:44:55.082 143779 DEBUG oslo_concurrency.lockutils [req-b3b55165-f18f-4bd9-94fc-8a942f9e8a9a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:44:55.082 143779 DEBUG oslo_concurrency.lockutils [req-b3b55165-f18f-4bd9-94fc-8a942f9e8a9a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:44:59.044 143780 DEBUG oslo_service.periodic_task [req-793239dd-72b2-469c-9a43-91951f7dd028 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:44:59.048 143780 DEBUG oslo_concurrency.lockutils [req-7346cf14-6224-422a-a449-a2b35ec14db7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:44:59.048 143780 DEBUG oslo_concurrency.lockutils [req-7346cf14-6224-422a-a449-a2b35ec14db7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:45:02.163 143787 DEBUG oslo_service.periodic_task [req-58358509-3b66-457e-a2ea-45d258187d84 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:45:02.167 143787 DEBUG oslo_concurrency.lockutils [req-62b62a63-b8af-418d-b4b6-7d33a746d67a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:45:02.167 143787 DEBUG oslo_concurrency.lockutils [req-62b62a63-b8af-418d-b4b6-7d33a746d67a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:45:07.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8db7eceea3924c2caf5da7312097ae66 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 00:45:07.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8db7eceea3924c2caf5da7312097ae66 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 00:45:07.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8db7eceea3924c2caf5da7312097ae66 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 00:45:07.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8db7eceea3924c2caf5da7312097ae66 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 00:45:07.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:45:07.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:45:07.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:45:07.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:45:07.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8db7eceea3924c2caf5da7312097ae66 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 00:45:07.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with 
unique_id: 8db7eceea3924c2caf5da7312097ae66 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 00:45:07.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8db7eceea3924c2caf5da7312097ae66 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 00:45:07.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8db7eceea3924c2caf5da7312097ae66 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 00:45:07.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:45:07.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:45:07.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:45:07.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:45:07.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:45:07.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:45:07.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:45:07.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:45:07.813 143781 DEBUG oslo_concurrency.lockutils [req-1a5b40c1-83f9-4e40-8d57-daab911645d5 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:45:07.813 143779 DEBUG oslo_concurrency.lockutils [req-1a5b40c1-83f9-4e40-8d57-daab911645d5 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:45:07.813 143787 DEBUG oslo_concurrency.lockutils [req-1a5b40c1-83f9-4e40-8d57-daab911645d5 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:45:07.813 143780 DEBUG oslo_concurrency.lockutils [req-1a5b40c1-83f9-4e40-8d57-daab911645d5 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:45:07.813 143781 DEBUG nova.scheduler.host_manager [req-1a5b40c1-83f9-4e40-8d57-daab911645d5 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 00:45:07.813 143779 DEBUG nova.scheduler.host_manager [req-1a5b40c1-83f9-4e40-8d57-daab911645d5 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 00:45:07.813 143787 DEBUG nova.scheduler.host_manager [req-1a5b40c1-83f9-4e40-8d57-daab911645d5 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 00:45:07.813 143780 DEBUG nova.scheduler.host_manager [req-1a5b40c1-83f9-4e40-8d57-daab911645d5 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 00:45:07.813 143781 DEBUG oslo_concurrency.lockutils [req-1a5b40c1-83f9-4e40-8d57-daab911645d5 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:45:07.813 143779 DEBUG oslo_concurrency.lockutils [req-1a5b40c1-83f9-4e40-8d57-daab911645d5 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:45:07.813 143787 DEBUG oslo_concurrency.lockutils [req-1a5b40c1-83f9-4e40-8d57-daab911645d5 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:45:07.813 143780 DEBUG oslo_concurrency.lockutils [req-1a5b40c1-83f9-4e40-8d57-daab911645d5 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:45:07.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:45:07.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:45:07.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:45:07.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:45:07.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:45:07.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:45:07.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:45:07.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:45:07.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:45:07.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:45:07.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:45:07.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:45:08.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:45:08.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:45:08.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:45:08.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:45:08.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:45:08.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:45:08.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:45:08.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:45:08.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:45:08.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:45:08.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:45:08.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:45:10.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:45:10.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:45:10.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:45:10.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:45:10.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:45:10.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:45:10.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:45:10.819 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:45:10.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:45:10.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:45:10.819 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:45:10.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:45:14.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:45:14.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:45:14.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:45:14.821 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:45:14.821 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:45:14.821 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:45:14.821 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:45:14.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:45:14.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:45:14.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:45:14.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:45:14.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:45:20.095 143781 DEBUG oslo_service.periodic_task [req-ead4427e-e0a7-4422-bd0e-874bc098ced0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:45:20.098 143781 DEBUG oslo_concurrency.lockutils [req-27baaf6f-58fb-4508-938b-ee42b07f977a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:45:20.099 143781 DEBUG oslo_concurrency.lockutils [req-27baaf6f-58fb-4508-938b-ee42b07f977a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:45:22.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:45:22.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:45:22.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:45:22.829 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:45:22.829 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:45:22.829 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:45:22.830 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:45:22.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:45:22.830 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:45:22.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:45:22.830 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:45:22.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:45:25.088 143779 DEBUG oslo_service.periodic_task [req-b3b55165-f18f-4bd9-94fc-8a942f9e8a9a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:45:25.091 143779 DEBUG oslo_concurrency.lockutils [req-8809d295-0a46-40ff-b843-036925911e91 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:45:25.092 143779 DEBUG oslo_concurrency.lockutils [req-8809d295-0a46-40ff-b843-036925911e91 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:45:29.058 143780 DEBUG oslo_service.periodic_task [req-7346cf14-6224-422a-a449-a2b35ec14db7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:45:29.062 143780 DEBUG oslo_concurrency.lockutils [req-73d1b0a7-c18b-432c-afce-98aac57dbce0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:45:29.062 143780 DEBUG oslo_concurrency.lockutils [req-73d1b0a7-c18b-432c-afce-98aac57dbce0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:45:32.180 143787 DEBUG oslo_service.periodic_task [req-62b62a63-b8af-418d-b4b6-7d33a746d67a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:45:32.184 143787 DEBUG oslo_concurrency.lockutils [req-cee7bc6d-929f-4419-afb1-5906ec05354e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:45:32.184 143787 DEBUG oslo_concurrency.lockutils [req-cee7bc6d-929f-4419-afb1-5906ec05354e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:45:39.024 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:45:39.025 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:45:39.025 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:45:39.026 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:45:39.026 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:45:39.026 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:45:39.042 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:45:39.042 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:45:39.043 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:45:39.071 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:45:39.071 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:45:39.071 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:45:50.107 143781 DEBUG oslo_service.periodic_task [req-27baaf6f-58fb-4508-938b-ee42b07f977a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:45:50.110 143781 DEBUG oslo_concurrency.lockutils [req-af6c5bf1-2182-4a1e-a71a-051685f7bda2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:45:50.111 143781 DEBUG oslo_concurrency.lockutils [req-af6c5bf1-2182-4a1e-a71a-051685f7bda2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:45:55.097 143779 DEBUG oslo_service.periodic_task [req-8809d295-0a46-40ff-b843-036925911e91 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:45:55.101 143779 DEBUG oslo_concurrency.lockutils [req-dccbd1c7-9f72-4de2-bc25-e727b44a8537 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:45:55.102 143779 DEBUG oslo_concurrency.lockutils [req-dccbd1c7-9f72-4de2-bc25-e727b44a8537 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:45:59.071 143780 DEBUG oslo_service.periodic_task [req-73d1b0a7-c18b-432c-afce-98aac57dbce0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:45:59.075 143780 DEBUG oslo_concurrency.lockutils [req-8ad2b4f9-c65b-474d-8acc-ecdb75c19882 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:45:59.075 143780 DEBUG oslo_concurrency.lockutils [req-8ad2b4f9-c65b-474d-8acc-ecdb75c19882 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:46:03.083 143787 DEBUG oslo_service.periodic_task [req-cee7bc6d-929f-4419-afb1-5906ec05354e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:46:03.087 143787 DEBUG oslo_concurrency.lockutils [req-3479f2e3-9f82-4c63-bf84-47af68a9aa11 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:46:03.088 143787 DEBUG oslo_concurrency.lockutils [req-3479f2e3-9f82-4c63-bf84-47af68a9aa11 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:46:21.041 143781 DEBUG oslo_service.periodic_task [req-af6c5bf1-2182-4a1e-a71a-051685f7bda2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:46:21.045 143781 DEBUG oslo_concurrency.lockutils [req-1c6c045e-e28f-4083-822f-894133be3c65 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:46:21.045 143781 DEBUG oslo_concurrency.lockutils [req-1c6c045e-e28f-4083-822f-894133be3c65 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:46:23.032 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:46:23.032 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:46:23.033 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:46:23.042 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:46:23.042 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:46:23.042 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:46:23.055 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:46:23.056 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:46:23.056 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:46:23.086 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:46:23.086 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:46:23.086 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:46:25.107 143779 DEBUG oslo_service.periodic_task [req-dccbd1c7-9f72-4de2-bc25-e727b44a8537 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:46:25.111 143779 DEBUG oslo_concurrency.lockutils [req-663b25d4-8bba-4431-b18f-680c16d93e55 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:46:25.111 143779 DEBUG oslo_concurrency.lockutils [req-663b25d4-8bba-4431-b18f-680c16d93e55 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:46:30.044 143780 DEBUG oslo_service.periodic_task [req-8ad2b4f9-c65b-474d-8acc-ecdb75c19882 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:46:30.048 143780 DEBUG oslo_concurrency.lockutils [req-c14f19df-4c73-483e-8ca9-e1e054e59775 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:46:30.048 143780 DEBUG oslo_concurrency.lockutils [req-c14f19df-4c73-483e-8ca9-e1e054e59775 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:46:33.101 143787 DEBUG oslo_service.periodic_task [req-3479f2e3-9f82-4c63-bf84-47af68a9aa11 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:46:33.106 143787 DEBUG oslo_concurrency.lockutils [req-54080716-5f5e-45da-a460-b3822e045bfe - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:46:33.107 143787 DEBUG oslo_concurrency.lockutils [req-54080716-5f5e-45da-a460-b3822e045bfe - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:46:51.057 143781 DEBUG oslo_service.periodic_task [req-1c6c045e-e28f-4083-822f-894133be3c65 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:46:51.062 143781 DEBUG oslo_concurrency.lockutils [req-4caa2ff6-20de-478f-b28c-1f6d8dd7e2f6 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:46:51.062 143781 DEBUG oslo_concurrency.lockutils [req-4caa2ff6-20de-478f-b28c-1f6d8dd7e2f6 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:46:55.118 143779 DEBUG oslo_service.periodic_task [req-663b25d4-8bba-4431-b18f-680c16d93e55 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:46:55.123 143779 DEBUG oslo_concurrency.lockutils [req-0b95628f-1534-4cdd-90a0-dab2058a7bfb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:46:55.124 143779 DEBUG oslo_concurrency.lockutils [req-0b95628f-1534-4cdd-90a0-dab2058a7bfb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:47:01.044 143780 DEBUG oslo_service.periodic_task [req-c14f19df-4c73-483e-8ca9-e1e054e59775 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:47:01.048 143780 DEBUG oslo_concurrency.lockutils [req-f1b28d13-c2d3-439d-839a-b68aa835b0cb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:47:01.048 143780 DEBUG oslo_concurrency.lockutils [req-f1b28d13-c2d3-439d-839a-b68aa835b0cb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:47:03.121 143787 DEBUG oslo_service.periodic_task [req-54080716-5f5e-45da-a460-b3822e045bfe - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:47:03.125 143787 DEBUG oslo_concurrency.lockutils [req-648594f2-1695-4d47-943c-d7f6b51224cc - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:47:03.126 143787 DEBUG oslo_concurrency.lockutils [req-648594f2-1695-4d47-943c-d7f6b51224cc - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:47:12.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 3fa4940c47cc42caac50dfbfa3c1180c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 00:47:12.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 3fa4940c47cc42caac50dfbfa3c1180c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 00:47:12.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 3fa4940c47cc42caac50dfbfa3c1180c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 00:47:12.824 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 3fa4940c47cc42caac50dfbfa3c1180c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 00:47:12.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:47:12.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3fa4940c47cc42caac50dfbfa3c1180c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 00:47:12.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:47:12.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:47:12.824 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:47:12.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3fa4940c47cc42caac50dfbfa3c1180c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 00:47:12.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:47:12.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3fa4940c47cc42caac50dfbfa3c1180c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 00:47:12.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:47:12.824 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3fa4940c47cc42caac50dfbfa3c1180c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 00:47:12.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:47:12.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:47:12.825 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:47:12.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:47:12.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:47:12.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:47:12.825 143787 DEBUG oslo_concurrency.lockutils [req-8ac7b1db-9474-4e1e-ae4c-f12bd6198e3e - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:47:12.825 143779 DEBUG oslo_concurrency.lockutils [req-8ac7b1db-9474-4e1e-ae4c-f12bd6198e3e - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:47:12.825 143787 DEBUG nova.scheduler.host_manager [req-8ac7b1db-9474-4e1e-ae4c-f12bd6198e3e - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 00:47:12.825 143779 DEBUG nova.scheduler.host_manager [req-8ac7b1db-9474-4e1e-ae4c-f12bd6198e3e - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 00:47:12.825 143781 DEBUG oslo_concurrency.lockutils [req-8ac7b1db-9474-4e1e-ae4c-f12bd6198e3e - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:47:12.825 143787 DEBUG oslo_concurrency.lockutils [req-8ac7b1db-9474-4e1e-ae4c-f12bd6198e3e - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:47:12.826 143779 DEBUG oslo_concurrency.lockutils [req-8ac7b1db-9474-4e1e-ae4c-f12bd6198e3e - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:47:12.826 143780 DEBUG oslo_concurrency.lockutils [req-8ac7b1db-9474-4e1e-ae4c-f12bd6198e3e - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:47:12.826 143781 DEBUG nova.scheduler.host_manager [req-8ac7b1db-9474-4e1e-ae4c-f12bd6198e3e - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 00:47:12.826 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:47:12.826 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:47:12.826 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:47:12.826 143781 DEBUG oslo_concurrency.lockutils [req-8ac7b1db-9474-4e1e-ae4c-f12bd6198e3e - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:47:12.826 143780 DEBUG nova.scheduler.host_manager [req-8ac7b1db-9474-4e1e-ae4c-f12bd6198e3e - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 00:47:12.826 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:47:12.826 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:47:12.826 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:47:12.826 143780 DEBUG oslo_concurrency.lockutils [req-8ac7b1db-9474-4e1e-ae4c-f12bd6198e3e - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:47:12.826 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:47:12.826 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:47:12.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:47:12.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:47:12.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:47:12.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:47:13.828 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:47:13.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:47:13.828 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:47:13.828 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:47:13.828 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:47:13.828 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:47:13.828 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:47:13.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:47:13.828 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:47:13.828 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:47:13.828 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:47:13.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:47:15.829 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:47:15.830 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:47:15.830 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:47:15.830 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:47:15.830 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:47:15.830 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:47:15.831 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:47:15.831 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:47:15.831 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:47:15.831 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:47:15.831 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:47:15.831 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:47:19.833 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:47:19.834 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:47:19.834 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:47:19.834 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:47:19.834 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:47:19.834 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:47:19.834 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:47:19.835 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:47:19.835 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:47:19.835 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:47:19.835 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:47:19.835 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:47:22.041 143781 DEBUG oslo_service.periodic_task [req-4caa2ff6-20de-478f-b28c-1f6d8dd7e2f6 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:47:22.046 143781 DEBUG oslo_concurrency.lockutils [req-0f056e0f-26e9-4993-9f08-e161326746e2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:47:22.046 143781 DEBUG oslo_concurrency.lockutils [req-0f056e0f-26e9-4993-9f08-e161326746e2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:47:25.131 143779 DEBUG oslo_service.periodic_task [req-0b95628f-1534-4cdd-90a0-dab2058a7bfb - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:47:25.136 143779 DEBUG oslo_concurrency.lockutils [req-8515696f-84a8-409d-b68e-980bde1ecfdb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" 
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:47:25.136 143779 DEBUG oslo_concurrency.lockutils [req-8515696f-84a8-409d-b68e-980bde1ecfdb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:47:27.836 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:47:27.836 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:47:27.837 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:47:27.837 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:47:27.837 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:47:27.837 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:47:27.837 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:47:27.838 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:47:27.838 143780 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:47:27.841 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:47:27.841 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:47:27.841 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:47:31.055 143780 DEBUG oslo_service.periodic_task [req-f1b28d13-c2d3-439d-839a-b68aa835b0cb - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:47:31.059 143780 DEBUG oslo_concurrency.lockutils [req-428b3906-7b6c-40de-b9fd-ca3633eb444f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:47:31.059 143780 DEBUG oslo_concurrency.lockutils [req-428b3906-7b6c-40de-b9fd-ca3633eb444f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:47:34.083 143787 DEBUG oslo_service.periodic_task [req-648594f2-1695-4d47-943c-d7f6b51224cc - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:47:34.086 143787 DEBUG oslo_concurrency.lockutils 
[req-825611e2-03d6-44b6-abc7-c70f31d868d7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:47:34.087 143787 DEBUG oslo_concurrency.lockutils [req-825611e2-03d6-44b6-abc7-c70f31d868d7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:47:43.838 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:47:43.838 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:47:43.838 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:47:43.839 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:47:43.839 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:47:43.839 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:47:43.839 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:47:43.839 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:47:43.840 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:47:43.842 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:47:43.842 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:47:43.843 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:47:52.058 143781 DEBUG oslo_service.periodic_task [req-0f056e0f-26e9-4993-9f08-e161326746e2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:47:52.062 143781 DEBUG oslo_concurrency.lockutils [req-afd6fc94-7b17-4bb8-8ff2-9c400822052e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:47:52.062 143781 DEBUG oslo_concurrency.lockutils [req-afd6fc94-7b17-4bb8-8ff2-9c400822052e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:47:55.142 143779 DEBUG oslo_service.periodic_task [req-8515696f-84a8-409d-b68e-980bde1ecfdb - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks 
/usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:47:55.148 143779 DEBUG oslo_concurrency.lockutils [req-c6f85ae9-c876-4082-976f-9774af6f4b25 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:47:55.149 143779 DEBUG oslo_concurrency.lockutils [req-c6f85ae9-c876-4082-976f-9774af6f4b25 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:48:01.065 143780 DEBUG oslo_service.periodic_task [req-428b3906-7b6c-40de-b9fd-ca3633eb444f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:48:01.069 143780 DEBUG oslo_concurrency.lockutils [req-bd3d9dee-0102-4d0b-9f7e-cdf7cedbd977 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:48:01.070 143780 DEBUG oslo_concurrency.lockutils [req-bd3d9dee-0102-4d0b-9f7e-cdf7cedbd977 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:48:04.092 143787 DEBUG oslo_service.periodic_task [req-825611e2-03d6-44b6-abc7-c70f31d868d7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:48:04.096 143787 DEBUG 
oslo_concurrency.lockutils [req-9f0f0f3a-2bf6-402f-a374-eed6f81e0a7c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:48:04.096 143787 DEBUG oslo_concurrency.lockutils [req-9f0f0f3a-2bf6-402f-a374-eed6f81e0a7c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:48:22.073 143781 DEBUG oslo_service.periodic_task [req-afd6fc94-7b17-4bb8-8ff2-9c400822052e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:48:22.077 143781 DEBUG oslo_concurrency.lockutils [req-7d11a4ab-7ceb-498c-b1ad-66e550d0da40 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:48:22.077 143781 DEBUG oslo_concurrency.lockutils [req-7d11a4ab-7ceb-498c-b1ad-66e550d0da40 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:48:23.035 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:48:23.035 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:48:23.036 143781 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:48:23.045 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:48:23.046 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:48:23.046 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:48:23.060 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:48:23.060 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:48:23.060 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:48:23.090 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:48:23.090 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:48:23.090 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:48:26.057 143779 DEBUG oslo_service.periodic_task [req-c6f85ae9-c876-4082-976f-9774af6f4b25 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks 
/usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:48:26.061 143779 DEBUG oslo_concurrency.lockutils [req-f718b1a5-3245-4f5c-85b8-d38d86e180ed - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:48:26.061 143779 DEBUG oslo_concurrency.lockutils [req-f718b1a5-3245-4f5c-85b8-d38d86e180ed - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:48:31.083 143780 DEBUG oslo_service.periodic_task [req-bd3d9dee-0102-4d0b-9f7e-cdf7cedbd977 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:48:31.087 143780 DEBUG oslo_concurrency.lockutils [req-f694d826-9564-42d2-8585-bfa5016754d2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:48:31.087 143780 DEBUG oslo_concurrency.lockutils [req-f694d826-9564-42d2-8585-bfa5016754d2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:48:35.084 143787 DEBUG oslo_service.periodic_task [req-9f0f0f3a-2bf6-402f-a374-eed6f81e0a7c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:48:35.088 143787 DEBUG 
oslo_concurrency.lockutils [req-43aa440c-278d-4ab1-9eca-7a8bb9991999 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:48:35.088 143787 DEBUG oslo_concurrency.lockutils [req-43aa440c-278d-4ab1-9eca-7a8bb9991999 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:48:53.041 143781 DEBUG oslo_service.periodic_task [req-7d11a4ab-7ceb-498c-b1ad-66e550d0da40 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:48:53.045 143781 DEBUG oslo_concurrency.lockutils [req-eb75b05e-7525-4e58-a11e-3d37809bc5db - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:48:53.046 143781 DEBUG oslo_concurrency.lockutils [req-eb75b05e-7525-4e58-a11e-3d37809bc5db - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:48:56.068 143779 DEBUG oslo_service.periodic_task [req-f718b1a5-3245-4f5c-85b8-d38d86e180ed - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:48:56.072 143779 DEBUG oslo_concurrency.lockutils [req-1d600fab-c06c-45a1-aa13-935861f362e4 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" 
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:48:56.073 143779 DEBUG oslo_concurrency.lockutils [req-1d600fab-c06c-45a1-aa13-935861f362e4 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:49:01.099 143780 DEBUG oslo_service.periodic_task [req-f694d826-9564-42d2-8585-bfa5016754d2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:49:01.103 143780 DEBUG oslo_concurrency.lockutils [req-868f43c9-a2ca-4934-b942-e7279b72f001 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:49:01.103 143780 DEBUG oslo_concurrency.lockutils [req-868f43c9-a2ca-4934-b942-e7279b72f001 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:49:05.093 143787 DEBUG oslo_service.periodic_task [req-43aa440c-278d-4ab1-9eca-7a8bb9991999 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:49:05.097 143787 DEBUG oslo_concurrency.lockutils [req-e2a7713a-39db-4add-ad78-5e855f441fac - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:49:05.097 143787 DEBUG oslo_concurrency.lockutils [req-e2a7713a-39db-4add-ad78-5e855f441fac - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:49:14.932 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8d8bfc2433c74a84b82a364d79141e95 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 00:49:14.932 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8d8bfc2433c74a84b82a364d79141e95 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 00:49:14.932 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8d8bfc2433c74a84b82a364d79141e95 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 00:49:14.932 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:49:14.932 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:49:14.932 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8d8bfc2433c74a84b82a364d79141e95 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 00:49:14.932 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8d8bfc2433c74a84b82a364d79141e95 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 00:49:14.932 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:14.933 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:14.933 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:14.933 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8d8bfc2433c74a84b82a364d79141e95 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 00:49:14.933 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:14.933 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:14.933 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:14.933 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:14.933 143779 DEBUG oslo_concurrency.lockutils [req-45244ef9-6aae-468f-95f1-7b203c3c2eee - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:49:14.933 143787 DEBUG oslo_concurrency.lockutils [req-45244ef9-6aae-468f-95f1-7b203c3c2eee - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:49:14.934 143779 DEBUG nova.scheduler.host_manager [req-45244ef9-6aae-468f-95f1-7b203c3c2eee - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 00:49:14.934 143787 DEBUG nova.scheduler.host_manager [req-45244ef9-6aae-468f-95f1-7b203c3c2eee - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 00:49:14.934 143781 DEBUG oslo_concurrency.lockutils [req-45244ef9-6aae-468f-95f1-7b203c3c2eee - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:49:14.934 143787 DEBUG oslo_concurrency.lockutils [req-45244ef9-6aae-468f-95f1-7b203c3c2eee - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:49:14.934 143779 DEBUG oslo_concurrency.lockutils [req-45244ef9-6aae-468f-95f1-7b203c3c2eee - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:49:14.934 143781 DEBUG nova.scheduler.host_manager [req-45244ef9-6aae-468f-95f1-7b203c3c2eee - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 00:49:14.934 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:49:14.934 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:49:14.934 143781 DEBUG oslo_concurrency.lockutils [req-45244ef9-6aae-468f-95f1-7b203c3c2eee - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:49:14.934 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:14.934 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:14.934 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:14.934 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:14.934 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:49:14.934 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:14.935 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:14.935 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8d8bfc2433c74a84b82a364d79141e95 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 00:49:14.936 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:14.936 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8d8bfc2433c74a84b82a364d79141e95 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 00:49:14.936 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:14.936 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:14.936 143780 DEBUG oslo_concurrency.lockutils [req-45244ef9-6aae-468f-95f1-7b203c3c2eee - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:49:14.937 143780 DEBUG nova.scheduler.host_manager [req-45244ef9-6aae-468f-95f1-7b203c3c2eee - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 00:49:14.937 143780 DEBUG oslo_concurrency.lockutils [req-45244ef9-6aae-468f-95f1-7b203c3c2eee - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:49:14.938 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:49:14.939 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:14.939 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:15.935 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:49:15.935 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:49:15.936 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:15.936 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:15.936 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:15.936 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:49:15.936 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:15.936 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:15.936 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:15.939 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:49:15.939 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:15.940 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:17.938 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:49:17.938 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:17.938 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:17.938 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:49:17.939 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:17.938 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:49:17.939 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:17.939 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:17.939 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:17.941 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:49:17.941 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:17.942 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:21.941 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:49:21.942 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:49:21.942 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:21.942 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:21.942 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:21.942 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:21.943 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:49:21.943 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:21.943 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:21.945 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:49:21.945 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:21.945 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:23.051 143781 DEBUG oslo_service.periodic_task [req-eb75b05e-7525-4e58-a11e-3d37809bc5db - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:49:23.055 143781 DEBUG oslo_concurrency.lockutils [req-1b7d7ae5-907d-4a04-8471-10afc202f63b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:49:23.055 143781 DEBUG oslo_concurrency.lockutils [req-1b7d7ae5-907d-4a04-8471-10afc202f63b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:49:27.057 143779 DEBUG oslo_service.periodic_task [req-1d600fab-c06c-45a1-aa13-935861f362e4 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:49:27.061 143779 DEBUG oslo_concurrency.lockutils [req-75313867-ac9d-4e02-afbf-edc2f185de15 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:49:27.062 143779 DEBUG oslo_concurrency.lockutils [req-75313867-ac9d-4e02-afbf-edc2f185de15 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:49:29.946 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:49:29.947 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:29.947 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:29.949 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:49:29.950 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:29.950 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:29.950 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:49:29.950 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:29.950 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:29.951 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:49:29.952 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:29.952 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:31.108 143780 DEBUG oslo_service.periodic_task [req-868f43c9-a2ca-4934-b942-e7279b72f001 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:49:31.112 143780 DEBUG oslo_concurrency.lockutils [req-7e01a13a-38a2-41e9-be66-695ddeb5316e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:49:31.112 143780 DEBUG oslo_concurrency.lockutils [req-7e01a13a-38a2-41e9-be66-695ddeb5316e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:49:35.103 143787 DEBUG oslo_service.periodic_task [req-e2a7713a-39db-4add-ad78-5e855f441fac - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:49:35.107 143787 DEBUG oslo_concurrency.lockutils [req-8009e053-22e7-47e3-85c9-66c5a758be7b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:49:35.107 143787 DEBUG oslo_concurrency.lockutils [req-8009e053-22e7-47e3-85c9-66c5a758be7b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:49:45.948 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:49:45.949 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:45.949 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:45.951 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:49:45.952 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:45.952 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:45.952 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:49:45.952 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:45.952 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:45.953 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:49:45.953 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:49:45.954 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:49:54.041 143781 DEBUG oslo_service.periodic_task [req-1b7d7ae5-907d-4a04-8471-10afc202f63b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:49:54.046 143781 DEBUG oslo_concurrency.lockutils [req-0e214c89-2a75-4e44-9581-934e36aac4ba - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:49:54.046 143781 DEBUG oslo_concurrency.lockutils [req-0e214c89-2a75-4e44-9581-934e36aac4ba - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:49:57.070 143779 DEBUG oslo_service.periodic_task [req-75313867-ac9d-4e02-afbf-edc2f185de15 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:49:57.074 143779 DEBUG oslo_concurrency.lockutils [req-56fba28e-e611-42ea-b8a6-ca96605b8202 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:49:57.074 143779 DEBUG oslo_concurrency.lockutils [req-56fba28e-e611-42ea-b8a6-ca96605b8202 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:50:01.116 143780 DEBUG oslo_service.periodic_task [req-7e01a13a-38a2-41e9-be66-695ddeb5316e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:50:01.121 143780 DEBUG oslo_concurrency.lockutils [req-1aad0c30-cb7f-4269-8b16-f3d290a91f04 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:50:01.121 143780 DEBUG oslo_concurrency.lockutils [req-1aad0c30-cb7f-4269-8b16-f3d290a91f04 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:50:05.113 143787 DEBUG oslo_service.periodic_task [req-8009e053-22e7-47e3-85c9-66c5a758be7b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:50:05.117 143787 DEBUG oslo_concurrency.lockutils [req-9fe4d701-f784-4e69-a9d7-89a7343a0cc3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:50:05.118 143787 DEBUG oslo_concurrency.lockutils [req-9fe4d701-f784-4e69-a9d7-89a7343a0cc3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:50:23.043 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:50:23.044 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:50:23.044 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:50:23.047 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:50:23.047 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:50:23.047 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:50:23.060 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:50:23.060 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:50:23.060 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:50:23.092 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:50:23.092 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:50:23.092 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:50:24.052 143781 DEBUG oslo_service.periodic_task [req-0e214c89-2a75-4e44-9581-934e36aac4ba - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:50:24.056 143781 DEBUG oslo_concurrency.lockutils [req-824fe0cc-8ee9-48c9-8c7f-ff2fa34dc91c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:50:24.056 143781 DEBUG oslo_concurrency.lockutils [req-824fe0cc-8ee9-48c9-8c7f-ff2fa34dc91c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:50:28.057 143779 DEBUG oslo_service.periodic_task [req-56fba28e-e611-42ea-b8a6-ca96605b8202 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:50:28.061 143779 DEBUG oslo_concurrency.lockutils [req-095e51a7-f295-47eb-811e-699a63207784 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:50:28.062 143779 DEBUG oslo_concurrency.lockutils [req-095e51a7-f295-47eb-811e-699a63207784 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:50:31.132 143780 DEBUG oslo_service.periodic_task [req-1aad0c30-cb7f-4269-8b16-f3d290a91f04 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:50:31.136 143780 DEBUG oslo_concurrency.lockutils [req-6c5ff8ce-d887-4e50-bdf6-431741766a22 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:50:31.136 143780 DEBUG oslo_concurrency.lockutils [req-6c5ff8ce-d887-4e50-bdf6-431741766a22 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:50:35.123 143787 DEBUG oslo_service.periodic_task [req-9fe4d701-f784-4e69-a9d7-89a7343a0cc3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:50:35.127 143787 DEBUG oslo_concurrency.lockutils [req-1531f4fc-e629-46f3-9be5-147c1caf3c87 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:50:35.127 143787 DEBUG oslo_concurrency.lockutils [req-1531f4fc-e629-46f3-9be5-147c1caf3c87 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:50:55.042 143781 DEBUG oslo_service.periodic_task [req-824fe0cc-8ee9-48c9-8c7f-ff2fa34dc91c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:50:55.046 143781 DEBUG oslo_concurrency.lockutils [req-4fa66399-e22f-4c1a-bc57-0fa1793db92f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:50:55.046 143781 DEBUG oslo_concurrency.lockutils [req-4fa66399-e22f-4c1a-bc57-0fa1793db92f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:50:59.058 143779 DEBUG oslo_service.periodic_task [req-095e51a7-f295-47eb-811e-699a63207784 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:50:59.062 143779 DEBUG oslo_concurrency.lockutils [req-6f033d9d-c965-466e-9d6f-68c76b5254d7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:50:59.062 143779 DEBUG oslo_concurrency.lockutils [req-6f033d9d-c965-466e-9d6f-68c76b5254d7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:51:01.146 143780 DEBUG oslo_service.periodic_task [req-6c5ff8ce-d887-4e50-bdf6-431741766a22 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:51:01.150 143780 DEBUG oslo_concurrency.lockutils [req-cf82afc8-7889-41e5-9fe7-7b6c95eb867a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:51:01.150 143780 DEBUG oslo_concurrency.lockutils [req-cf82afc8-7889-41e5-9fe7-7b6c95eb867a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:51:05.132 143787 DEBUG oslo_service.periodic_task [req-1531f4fc-e629-46f3-9be5-147c1caf3c87 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:51:05.137 143787 DEBUG oslo_concurrency.lockutils [req-dd413051-0fc9-4a1d-89b2-e7b465546e2f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:51:05.137 143787 DEBUG oslo_concurrency.lockutils [req-dd413051-0fc9-4a1d-89b2-e7b465546e2f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:51:14.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 9907ea299aec45f6be4e50c6106da11a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 00:51:14.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 9907ea299aec45f6be4e50c6106da11a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 00:51:14.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 9907ea299aec45f6be4e50c6106da11a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 00:51:14.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 9907ea299aec45f6be4e50c6106da11a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 00:51:14.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:51:14.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:51:14.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:51:14.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9907ea299aec45f6be4e50c6106da11a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 00:51:14.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9907ea299aec45f6be4e50c6106da11a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 00:51:14.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9907ea299aec45f6be4e50c6106da11a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 00:51:14.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:51:14.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:51:14.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:51:14.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:51:14.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:51:14.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9907ea299aec45f6be4e50c6106da11a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 00:51:14.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:51:14.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:51:14.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:51:14.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:51:14.818 143781 DEBUG oslo_concurrency.lockutils [req-b1922b67-13cf-4fec-869a-72d7404c3022 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:51:14.818 143780 DEBUG oslo_concurrency.lockutils [req-b1922b67-13cf-4fec-869a-72d7404c3022 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:51:14.818 143787 DEBUG oslo_concurrency.lockutils [req-b1922b67-13cf-4fec-869a-72d7404c3022 - - - - -] Lock "host_instance" acquired by
"nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:51:14.818 143781 DEBUG nova.scheduler.host_manager [req-b1922b67-13cf-4fec-869a-72d7404c3022 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 00:51:14.819 143780 DEBUG nova.scheduler.host_manager [req-b1922b67-13cf-4fec-869a-72d7404c3022 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 00:51:14.819 143787 DEBUG nova.scheduler.host_manager [req-b1922b67-13cf-4fec-869a-72d7404c3022 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 00:51:14.819 143781 DEBUG oslo_concurrency.lockutils [req-b1922b67-13cf-4fec-869a-72d7404c3022 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:51:14.819 143780 DEBUG oslo_concurrency.lockutils [req-b1922b67-13cf-4fec-869a-72d7404c3022 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:51:14.819 143787 DEBUG oslo_concurrency.lockutils [req-b1922b67-13cf-4fec-869a-72d7404c3022 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:51:14.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:51:14.819 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:51:14.819 143779 DEBUG oslo_concurrency.lockutils [req-b1922b67-13cf-4fec-869a-72d7404c3022 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:51:14.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:51:14.819 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:51:14.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:51:14.819 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:51:14.819 143779 DEBUG nova.scheduler.host_manager [req-b1922b67-13cf-4fec-869a-72d7404c3022 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 00:51:14.820 143779 DEBUG oslo_concurrency.lockutils [req-b1922b67-13cf-4fec-869a-72d7404c3022 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:51:14.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:51:14.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:51:14.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:51:14.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:51:14.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:51:14.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:51:15.821 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:51:15.821 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:51:15.821 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:51:15.821 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:51:15.821 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:51:15.821 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:51:15.821 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:51:15.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:51:15.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:51:15.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:51:15.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:51:15.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:51:17.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:51:17.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:51:17.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:51:17.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:51:17.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:51:17.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:51:17.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:51:17.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:51:17.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:51:17.825 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:51:17.825 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:51:17.825 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:51:21.828 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:51:21.828 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:51:21.828 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:51:21.829 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:51:21.829 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:51:21.829 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:51:21.829 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:51:21.829 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:51:21.829 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:51:21.829 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:51:21.830 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:51:21.830 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:51:25.051 143781 DEBUG oslo_service.periodic_task [req-4fa66399-e22f-4c1a-bc57-0fa1793db92f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:51:25.055 143781 DEBUG oslo_concurrency.lockutils [req-8f5dd07a-eb71-41d7-95e2-897f90e64efb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:51:25.055 143781 DEBUG oslo_concurrency.lockutils [req-8f5dd07a-eb71-41d7-95e2-897f90e64efb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:51:29.832 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:51:29.832 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:51:29.832 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:51:29.833 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:51:29.833 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:51:29.834 143781 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:51:29.834 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:51:29.834 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:51:29.834 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:51:29.836 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:51:29.836 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:51:29.836 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:51:30.057 143779 DEBUG oslo_service.periodic_task [req-6f033d9d-c965-466e-9d6f-68c76b5254d7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:51:30.061 143779 DEBUG oslo_concurrency.lockutils [req-06fc941f-5a3e-49f3-9f0d-c8e0205fb0f5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:51:30.061 143779 DEBUG oslo_concurrency.lockutils [req-06fc941f-5a3e-49f3-9f0d-c8e0205fb0f5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:51:31.156 143780 DEBUG oslo_service.periodic_task [req-cf82afc8-7889-41e5-9fe7-7b6c95eb867a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:51:31.161 143780 DEBUG oslo_concurrency.lockutils [req-221f5479-8130-4e1a-b9cc-853c36ca8792 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:51:31.161 143780 DEBUG oslo_concurrency.lockutils [req-221f5479-8130-4e1a-b9cc-853c36ca8792 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:51:35.143 143787 DEBUG oslo_service.periodic_task [req-dd413051-0fc9-4a1d-89b2-e7b465546e2f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:51:35.147 143787 DEBUG oslo_concurrency.lockutils [req-eabe02d7-52c8-4daa-a783-e76afc152ac1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:51:35.147 143787 DEBUG oslo_concurrency.lockutils [req-eabe02d7-52c8-4daa-a783-e76afc152ac1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:51:45.834 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:51:45.834 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:51:45.834 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:51:45.835 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:51:45.835 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:51:45.835 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:51:45.835 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:51:45.836 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:51:45.836 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:51:45.838 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:51:45.838 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:51:45.839 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:51:56.041 143781 DEBUG oslo_service.periodic_task [req-8f5dd07a-eb71-41d7-95e2-897f90e64efb - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:51:56.045 143781 DEBUG oslo_concurrency.lockutils [req-0a375ee7-2b85-4a9d-8d62-338f401ae2b9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:51:56.045 143781 DEBUG oslo_concurrency.lockutils [req-0a375ee7-2b85-4a9d-8d62-338f401ae2b9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:52:00.072 143779 DEBUG oslo_service.periodic_task [req-06fc941f-5a3e-49f3-9f0d-c8e0205fb0f5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:52:00.076 143779 DEBUG oslo_concurrency.lockutils [req-525dab76-c9ef-40bb-b647-b01a5d54ce73 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:52:00.077 143779 DEBUG oslo_concurrency.lockutils [req-525dab76-c9ef-40bb-b647-b01a5d54ce73 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:52:01.165 143780 DEBUG oslo_service.periodic_task [req-221f5479-8130-4e1a-b9cc-853c36ca8792 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:52:01.170 143780 DEBUG oslo_concurrency.lockutils [req-2ed4398b-3375-48f6-ba2c-bd7521e5f056 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:52:01.170 143780 DEBUG oslo_concurrency.lockutils [req-2ed4398b-3375-48f6-ba2c-bd7521e5f056 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:52:06.083 143787 DEBUG oslo_service.periodic_task [req-eabe02d7-52c8-4daa-a783-e76afc152ac1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:52:06.087 143787 DEBUG oslo_concurrency.lockutils [req-f81ad487-312c-4213-bed6-d263e72d1ce1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:52:06.087 143787 DEBUG oslo_concurrency.lockutils [req-f81ad487-312c-4213-bed6-d263e72d1ce1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:52:23.048 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:52:23.048 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:52:23.048 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:52:23.051 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:52:23.051 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:52:23.051 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:52:23.066 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:52:23.066 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:52:23.066 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:52:23.094 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:52:23.094 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:52:23.094 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:52:26.052 143781 DEBUG oslo_service.periodic_task [req-0a375ee7-2b85-4a9d-8d62-338f401ae2b9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:52:26.056 143781 DEBUG oslo_concurrency.lockutils [req-ad321d67-2c53-40df-a2f5-6285a2b3a5e1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:52:26.056 143781 DEBUG oslo_concurrency.lockutils [req-ad321d67-2c53-40df-a2f5-6285a2b3a5e1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:52:30.087 143779 DEBUG oslo_service.periodic_task [req-525dab76-c9ef-40bb-b647-b01a5d54ce73 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:52:30.091 143779 DEBUG oslo_concurrency.lockutils [req-98746634-14fb-4572-a134-983c32bc6085 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:52:30.091 143779 DEBUG oslo_concurrency.lockutils [req-98746634-14fb-4572-a134-983c32bc6085 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:52:31.181 143780 DEBUG oslo_service.periodic_task [req-2ed4398b-3375-48f6-ba2c-bd7521e5f056 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:52:31.185 143780 DEBUG oslo_concurrency.lockutils [req-cba608af-b61d-4113-b890-a0a02b2fc57f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:52:31.185 143780 DEBUG oslo_concurrency.lockutils [req-cba608af-b61d-4113-b890-a0a02b2fc57f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:52:36.094 143787 DEBUG oslo_service.periodic_task [req-f81ad487-312c-4213-bed6-d263e72d1ce1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:52:36.097 143787 DEBUG oslo_concurrency.lockutils [req-60ea64cc-d161-4ccc-b481-dcfe77c992cf - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:52:36.098 143787 DEBUG oslo_concurrency.lockutils [req-60ea64cc-d161-4ccc-b481-dcfe77c992cf - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:52:57.041 143781 DEBUG oslo_service.periodic_task [req-ad321d67-2c53-40df-a2f5-6285a2b3a5e1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:52:57.046 143781 DEBUG oslo_concurrency.lockutils [req-c5865a1a-61e0-4ee8-9299-9b8cd4134e2d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:52:57.046 143781 DEBUG oslo_concurrency.lockutils [req-c5865a1a-61e0-4ee8-9299-9b8cd4134e2d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:53:00.101 143779 DEBUG oslo_service.periodic_task [req-98746634-14fb-4572-a134-983c32bc6085 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:53:00.106 143779 DEBUG oslo_concurrency.lockutils [req-bfe5af3d-0349-4a0f-a27e-0cdbd991ab86 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:53:00.106 143779 DEBUG oslo_concurrency.lockutils [req-bfe5af3d-0349-4a0f-a27e-0cdbd991ab86 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:53:02.044 143780 DEBUG oslo_service.periodic_task [req-cba608af-b61d-4113-b890-a0a02b2fc57f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:53:02.048 143780 DEBUG oslo_concurrency.lockutils [req-12fb3bc1-0842-4d5e-8ba5-010c7438e938 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:53:02.048 143780 DEBUG oslo_concurrency.lockutils [req-12fb3bc1-0842-4d5e-8ba5-010c7438e938 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:53:06.103 143787 DEBUG oslo_service.periodic_task [req-60ea64cc-d161-4ccc-b481-dcfe77c992cf - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:53:06.107 143787 DEBUG oslo_concurrency.lockutils [req-49337ae5-147a-4749-84f6-d444697cdfd5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:53:06.108 143787 DEBUG oslo_concurrency.lockutils [req-49337ae5-147a-4749-84f6-d444697cdfd5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:53:14.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 86f5d8bb23b74bfea15fb78cbecf2f87 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 00:53:14.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 86f5d8bb23b74bfea15fb78cbecf2f87 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 00:53:14.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 86f5d8bb23b74bfea15fb78cbecf2f87 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 00:53:14.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 86f5d8bb23b74bfea15fb78cbecf2f87 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 00:53:14.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:14.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:14.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:14.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 86f5d8bb23b74bfea15fb78cbecf2f87 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 00:53:14.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 86f5d8bb23b74bfea15fb78cbecf2f87 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 00:53:14.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 86f5d8bb23b74bfea15fb78cbecf2f87 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 00:53:14.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:14.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:14.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:14.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:14.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:14.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:14.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 86f5d8bb23b74bfea15fb78cbecf2f87 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 00:53:14.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:14.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:14.815 143780 DEBUG oslo_concurrency.lockutils [req-78392615-6d28-479d-9975-8bfd9c8bf7bb - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:53:14.815 143779 DEBUG oslo_concurrency.lockutils [req-78392615-6d28-479d-9975-8bfd9c8bf7bb - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:53:14.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:14.815 143780 DEBUG nova.scheduler.host_manager [req-78392615-6d28-479d-9975-8bfd9c8bf7bb - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 00:53:14.815 143779 DEBUG nova.scheduler.host_manager [req-78392615-6d28-479d-9975-8bfd9c8bf7bb - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 00:53:14.815 143780 DEBUG oslo_concurrency.lockutils [req-78392615-6d28-479d-9975-8bfd9c8bf7bb - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:53:14.815 143779 DEBUG oslo_concurrency.lockutils [req-78392615-6d28-479d-9975-8bfd9c8bf7bb - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:53:14.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:53:14.816 143787 DEBUG oslo_concurrency.lockutils [req-78392615-6d28-479d-9975-8bfd9c8bf7bb - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:53:14.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:14.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:53:14.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:14.816 143781 DEBUG oslo_concurrency.lockutils [req-78392615-6d28-479d-9975-8bfd9c8bf7bb - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:53:14.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:14.816 143787 DEBUG nova.scheduler.host_manager [req-78392615-6d28-479d-9975-8bfd9c8bf7bb - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 00:53:14.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:14.816 143787 DEBUG oslo_concurrency.lockutils [req-78392615-6d28-479d-9975-8bfd9c8bf7bb - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:53:14.816 143781 DEBUG nova.scheduler.host_manager [req-78392615-6d28-479d-9975-8bfd9c8bf7bb - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 00:53:14.816 143781 DEBUG oslo_concurrency.lockutils [req-78392615-6d28-479d-9975-8bfd9c8bf7bb - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:53:14.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:53:14.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:14.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:14.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:53:14.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:14.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:15.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:53:15.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:53:15.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:15.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:15.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:53:15.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:15.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:15.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:15.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:15.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:53:15.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:15.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:17.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:53:17.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:17.819 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:53:17.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:17.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:17.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:17.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:53:17.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:17.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:17.821 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:53:17.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:17.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:21.823 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:53:21.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:21.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:53:21.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:21.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:21.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:21.824 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:53:21.824 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:21.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:21.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:53:21.826 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:21.826 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:27.053 143781 DEBUG oslo_service.periodic_task [req-c5865a1a-61e0-4ee8-9299-9b8cd4134e2d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:53:27.057 143781 DEBUG oslo_concurrency.lockutils [req-67fa1bf9-7d60-4794-9b6d-7527779c122c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:53:27.058 143781 DEBUG oslo_concurrency.lockutils [req-67fa1bf9-7d60-4794-9b6d-7527779c122c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:53:29.829 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:53:29.830 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:29.830 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:29.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:53:29.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:29.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:29.830 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:53:29.831 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:29.831 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:29.831 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:53:29.832 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:29.832 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:30.112 143779 DEBUG oslo_service.periodic_task [req-bfe5af3d-0349-4a0f-a27e-0cdbd991ab86 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:53:30.116 143779 DEBUG oslo_concurrency.lockutils [req-48d6a994-1722-41b9-b41d-da3bad1806ab - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:53:30.116 143779 DEBUG oslo_concurrency.lockutils [req-48d6a994-1722-41b9-b41d-da3bad1806ab - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:53:32.054 143780 DEBUG oslo_service.periodic_task [req-12fb3bc1-0842-4d5e-8ba5-010c7438e938 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:53:32.059 143780 DEBUG oslo_concurrency.lockutils [req-09960f70-3b1d-4048-bfb5-96eeb0586999 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:53:32.059 143780 DEBUG oslo_concurrency.lockutils [req-09960f70-3b1d-4048-bfb5-96eeb0586999 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:53:37.083 143787 DEBUG oslo_service.periodic_task [req-49337ae5-147a-4749-84f6-d444697cdfd5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:53:37.087 143787 DEBUG oslo_concurrency.lockutils [req-0a47095f-c528-4f73-b77e-75d1183ea627 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:53:37.088 143787 DEBUG oslo_concurrency.lockutils [req-0a47095f-c528-4f73-b77e-75d1183ea627 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:53:45.832 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:53:45.832 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:53:45.832 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:53:45.832 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:45.832 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:45.832 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:45.832 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:45.832 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:45.832 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:45.833 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:53:45.834 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:53:45.834 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:53:57.064 143781 DEBUG oslo_service.periodic_task [req-67fa1bf9-7d60-4794-9b6d-7527779c122c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:53:57.068 143781 DEBUG oslo_concurrency.lockutils [req-b846e796-faa5-41f3-b2a7-b2c7f71cd65c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:53:57.068 143781 DEBUG oslo_concurrency.lockutils [req-b846e796-faa5-41f3-b2a7-b2c7f71cd65c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:54:00.127 143779 DEBUG oslo_service.periodic_task [req-48d6a994-1722-41b9-b41d-da3bad1806ab - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:54:00.131 143779 DEBUG oslo_concurrency.lockutils [req-43b25a8b-f55f-4005-8d49-a4dae431e371 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:54:00.131 143779 DEBUG oslo_concurrency.lockutils [req-43b25a8b-f55f-4005-8d49-a4dae431e371 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:54:02.064 143780 DEBUG oslo_service.periodic_task [req-09960f70-3b1d-4048-bfb5-96eeb0586999 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:54:02.069 143780 DEBUG oslo_concurrency.lockutils [req-9ca399f7-c009-4fff-a021-772917296ca7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:54:02.069 143780 DEBUG oslo_concurrency.lockutils [req-9ca399f7-c009-4fff-a021-772917296ca7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:54:08.083 143787 DEBUG oslo_service.periodic_task [req-0a47095f-c528-4f73-b77e-75d1183ea627 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:54:08.088 143787 DEBUG oslo_concurrency.lockutils [req-c9abcbcb-4fd5-408a-88fc-060c5b2a6c28 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:54:08.089 143787 DEBUG oslo_concurrency.lockutils [req-c9abcbcb-4fd5-408a-88fc-060c5b2a6c28 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:54:23.052 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:54:23.052 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:54:23.052 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:54:23.055 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:54:23.056 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:54:23.056 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:54:23.070 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:54:23.070 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:54:23.070 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:54:23.099 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:54:23.099 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:54:23.099 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:54:27.074 143781 DEBUG oslo_service.periodic_task [req-b846e796-faa5-41f3-b2a7-b2c7f71cd65c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:54:27.078 143781 DEBUG oslo_concurrency.lockutils [req-e7e2656d-60de-42f3-bfd7-228f3738d283 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:54:27.078 143781 DEBUG oslo_concurrency.lockutils [req-e7e2656d-60de-42f3-bfd7-228f3738d283 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:54:30.141 143779 DEBUG oslo_service.periodic_task [req-43b25a8b-f55f-4005-8d49-a4dae431e371 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:54:30.146 143779 DEBUG oslo_concurrency.lockutils [req-3c005ca3-9c45-45df-96c6-f9bc27d4200f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:54:30.147 143779 DEBUG oslo_concurrency.lockutils [req-3c005ca3-9c45-45df-96c6-f9bc27d4200f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:54:33.044 143780 DEBUG oslo_service.periodic_task [req-9ca399f7-c009-4fff-a021-772917296ca7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:54:33.048 143780 DEBUG oslo_concurrency.lockutils [req-c57155a4-1968-4eb2-bfc6-2e65e5e2b23a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:54:33.049 143780 DEBUG oslo_concurrency.lockutils [req-c57155a4-1968-4eb2-bfc6-2e65e5e2b23a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:54:39.083 143787 DEBUG oslo_service.periodic_task [req-c9abcbcb-4fd5-408a-88fc-060c5b2a6c28 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells
run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:54:39.088 143787 DEBUG oslo_concurrency.lockutils [req-69c3d4a2-0760-4814-b7dd-8abe41642790 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:54:39.088 143787 DEBUG oslo_concurrency.lockutils [req-69c3d4a2-0760-4814-b7dd-8abe41642790 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:54:57.085 143781 DEBUG oslo_service.periodic_task [req-e7e2656d-60de-42f3-bfd7-228f3738d283 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:54:57.089 143781 DEBUG oslo_concurrency.lockutils [req-99707123-ac16-499e-97ec-c6d22261905a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:54:57.089 143781 DEBUG oslo_concurrency.lockutils [req-99707123-ac16-499e-97ec-c6d22261905a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:55:00.157 143779 DEBUG oslo_service.periodic_task [req-3c005ca3-9c45-45df-96c6-f9bc27d4200f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:55:00.161 143779 DEBUG 
oslo_concurrency.lockutils [req-e1581e5c-903d-4d62-8180-68ff14c0e335 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:55:00.161 143779 DEBUG oslo_concurrency.lockutils [req-e1581e5c-903d-4d62-8180-68ff14c0e335 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:55:03.063 143780 DEBUG oslo_service.periodic_task [req-c57155a4-1968-4eb2-bfc6-2e65e5e2b23a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:55:03.067 143780 DEBUG oslo_concurrency.lockutils [req-dd0807ad-c1f6-446a-9f56-13abf92c541c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:55:03.067 143780 DEBUG oslo_concurrency.lockutils [req-dd0807ad-c1f6-446a-9f56-13abf92c541c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:55:09.094 143787 DEBUG oslo_service.periodic_task [req-69c3d4a2-0760-4814-b7dd-8abe41642790 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:55:09.098 143787 DEBUG oslo_concurrency.lockutils [req-4785d94c-2adb-425c-b21e-d79fb3c96397 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" 
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:55:09.098 143787 DEBUG oslo_concurrency.lockutils [req-4785d94c-2adb-425c-b21e-d79fb3c96397 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:55:16.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 3502b338edca438e98f88dd5dc270cd0 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 00:55:16.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 3502b338edca438e98f88dd5dc270cd0 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 00:55:16.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 3502b338edca438e98f88dd5dc270cd0 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 00:55:16.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 3502b338edca438e98f88dd5dc270cd0 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 00:55:16.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:16.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:16.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3502b338edca438e98f88dd5dc270cd0 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 
00:55:16.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3502b338edca438e98f88dd5dc270cd0 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 00:55:16.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:16.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:16.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:16.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3502b338edca438e98f88dd5dc270cd0 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 00:55:16.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3502b338edca438e98f88dd5dc270cd0 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 00:55:16.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:16.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:16.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:16.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:16.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] 
Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:16.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:16.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:16.818 143781 DEBUG oslo_concurrency.lockutils [req-dd2241f4-1ef3-48d5-bb80-9cb8cfdc44f9 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:55:16.818 143780 DEBUG oslo_concurrency.lockutils [req-dd2241f4-1ef3-48d5-bb80-9cb8cfdc44f9 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:55:16.818 143781 DEBUG nova.scheduler.host_manager [req-dd2241f4-1ef3-48d5-bb80-9cb8cfdc44f9 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 00:55:16.818 143780 DEBUG nova.scheduler.host_manager [req-dd2241f4-1ef3-48d5-bb80-9cb8cfdc44f9 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 00:55:16.818 143781 DEBUG oslo_concurrency.lockutils [req-dd2241f4-1ef3-48d5-bb80-9cb8cfdc44f9 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:55:16.818 143780 DEBUG oslo_concurrency.lockutils [req-dd2241f4-1ef3-48d5-bb80-9cb8cfdc44f9 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:55:16.818 143779 DEBUG oslo_concurrency.lockutils [req-dd2241f4-1ef3-48d5-bb80-9cb8cfdc44f9 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:55:16.818 143787 DEBUG oslo_concurrency.lockutils [req-dd2241f4-1ef3-48d5-bb80-9cb8cfdc44f9 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:55:16.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:55:16.819 143779 DEBUG nova.scheduler.host_manager [req-dd2241f4-1ef3-48d5-bb80-9cb8cfdc44f9 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 00:55:16.819 143787 DEBUG nova.scheduler.host_manager [req-dd2241f4-1ef3-48d5-bb80-9cb8cfdc44f9 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 00:55:16.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:16.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:16.819 143779 DEBUG oslo_concurrency.lockutils [req-dd2241f4-1ef3-48d5-bb80-9cb8cfdc44f9 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:55:16.819 143787 DEBUG oslo_concurrency.lockutils [req-dd2241f4-1ef3-48d5-bb80-9cb8cfdc44f9 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:55:16.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:55:16.819 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:55:16.819 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:16.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:16.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:16.820 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:16.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:55:16.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:16.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:17.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:55:17.821 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:17.821 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:17.821 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:55:17.821 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:17.821 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:55:17.821 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:17.821 
143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:55:17.821 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:17.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:17.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:17.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:19.823 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:55:19.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:55:19.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:55:19.823 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:55:19.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:19.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:19.824 
143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:19.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:19.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:19.824 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:19.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:19.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:23.824 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:55:23.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:23.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:23.825 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:55:23.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 
00:55:23.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:55:23.825 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:23.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:23.825 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:23.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:23.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:23.826 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:27.097 143781 DEBUG oslo_service.periodic_task [req-99707123-ac16-499e-97ec-c6d22261905a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:55:27.101 143781 DEBUG oslo_concurrency.lockutils [req-2a0475fd-2424-4daa-8ceb-956ea98fb739 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:55:27.101 143781 DEBUG oslo_concurrency.lockutils [req-2a0475fd-2424-4daa-8ceb-956ea98fb739 - - - - -] Lock 
"93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:55:31.057 143779 DEBUG oslo_service.periodic_task [req-e1581e5c-903d-4d62-8180-68ff14c0e335 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:55:31.061 143779 DEBUG oslo_concurrency.lockutils [req-56f7bbe9-172b-4bd6-9b56-1ff7a098380f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:55:31.062 143779 DEBUG oslo_concurrency.lockutils [req-56f7bbe9-172b-4bd6-9b56-1ff7a098380f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:55:31.827 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:55:31.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:31.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:31.831 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:55:31.831 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:31.831 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:31.833 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:55:31.834 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:31.834 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:31.834 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:55:31.834 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:31.834 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:34.044 143780 DEBUG oslo_service.periodic_task [req-dd0807ad-c1f6-446a-9f56-13abf92c541c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:55:34.049 143780 DEBUG oslo_concurrency.lockutils [req-9f677e5b-1210-4bcb-8f0a-17db2e4d6bdc - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:55:34.049 143780 DEBUG 
oslo_concurrency.lockutils [req-9f677e5b-1210-4bcb-8f0a-17db2e4d6bdc - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:55:39.107 143787 DEBUG oslo_service.periodic_task [req-4785d94c-2adb-425c-b21e-d79fb3c96397 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:55:39.112 143787 DEBUG oslo_concurrency.lockutils [req-99dfc5de-0cc3-4ac9-9601-ec6670ee0b8a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:55:39.112 143787 DEBUG oslo_concurrency.lockutils [req-99dfc5de-0cc3-4ac9-9601-ec6670ee0b8a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:55:47.829 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:55:47.830 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:47.830 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:47.833 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:55:47.834 143781 
DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:47.834 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:47.836 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:55:47.836 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:47.836 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:55:47.836 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:47.836 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:55:47.836 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:55:57.109 143781 DEBUG oslo_service.periodic_task [req-2a0475fd-2424-4daa-8ceb-956ea98fb739 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:55:57.113 143781 DEBUG oslo_concurrency.lockutils [req-f496fd39-2014-4ed3-98f4-12d40d3da88c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:55:57.113 143781 DEBUG oslo_concurrency.lockutils [req-f496fd39-2014-4ed3-98f4-12d40d3da88c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:56:02.057 143779 DEBUG oslo_service.periodic_task [req-56f7bbe9-172b-4bd6-9b56-1ff7a098380f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:56:02.061 143779 DEBUG oslo_concurrency.lockutils [req-1a3ac3c0-1f6c-455f-af1e-6ef6878f7614 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:56:02.062 143779 DEBUG oslo_concurrency.lockutils [req-1a3ac3c0-1f6c-455f-af1e-6ef6878f7614 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:56:05.045 143780 DEBUG oslo_service.periodic_task [req-9f677e5b-1210-4bcb-8f0a-17db2e4d6bdc - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:56:05.050 143780 DEBUG oslo_concurrency.lockutils [req-dd3373d9-257d-4efa-a775-3126afd7f865 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:56:05.050 143780 DEBUG 
oslo_concurrency.lockutils [req-dd3373d9-257d-4efa-a775-3126afd7f865 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:56:10.083 143787 DEBUG oslo_service.periodic_task [req-99dfc5de-0cc3-4ac9-9601-ec6670ee0b8a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:56:10.087 143787 DEBUG oslo_concurrency.lockutils [req-61f8ddd2-91e9-4773-ac82-23d0aefbe45d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:56:10.087 143787 DEBUG oslo_concurrency.lockutils [req-61f8ddd2-91e9-4773-ac82-23d0aefbe45d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:56:23.053 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:56:23.053 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:56:23.054 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:56:23.056 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:56:23.056 143780 
DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:56:23.056 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:56:23.070 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:56:23.070 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:56:23.070 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:56:23.102 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:56:23.102 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:56:23.102 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:56:28.042 143781 DEBUG oslo_service.periodic_task [req-f496fd39-2014-4ed3-98f4-12d40d3da88c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:56:28.046 143781 DEBUG oslo_concurrency.lockutils [req-9439cf0b-deaf-452e-a2ca-e97dd83c103f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:56:28.046 143781 DEBUG oslo_concurrency.lockutils [req-9439cf0b-deaf-452e-a2ca-e97dd83c103f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:56:32.075 143779 DEBUG oslo_service.periodic_task [req-1a3ac3c0-1f6c-455f-af1e-6ef6878f7614 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:56:32.079 143779 DEBUG oslo_concurrency.lockutils [req-9b63af90-d0a9-45a4-9527-0af7afdc9059 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:56:32.079 143779 DEBUG oslo_concurrency.lockutils [req-9b63af90-d0a9-45a4-9527-0af7afdc9059 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:56:35.054 143780 DEBUG oslo_service.periodic_task [req-dd3373d9-257d-4efa-a775-3126afd7f865 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:56:35.058 143780 DEBUG oslo_concurrency.lockutils [req-9289bce7-9677-4594-8a18-d084e33a2a06 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:56:35.058 143780 DEBUG 
oslo_concurrency.lockutils [req-9289bce7-9677-4594-8a18-d084e33a2a06 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:56:41.084 143787 DEBUG oslo_service.periodic_task [req-61f8ddd2-91e9-4773-ac82-23d0aefbe45d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:56:41.088 143787 DEBUG oslo_concurrency.lockutils [req-cd74bbe5-2502-466b-b9a7-4c4f8567de50 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:56:41.088 143787 DEBUG oslo_concurrency.lockutils [req-cd74bbe5-2502-466b-b9a7-4c4f8567de50 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:56:58.053 143781 DEBUG oslo_service.periodic_task [req-9439cf0b-deaf-452e-a2ca-e97dd83c103f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:56:58.058 143781 DEBUG oslo_concurrency.lockutils [req-cecf70cf-5441-48ce-ba31-7e5d6a18f2d6 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:56:58.058 143781 DEBUG oslo_concurrency.lockutils [req-cecf70cf-5441-48ce-ba31-7e5d6a18f2d6 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:57:03.057 143779 DEBUG oslo_service.periodic_task [req-9b63af90-d0a9-45a4-9527-0af7afdc9059 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:57:03.062 143779 DEBUG oslo_concurrency.lockutils [req-4522c0de-9269-4cb9-b603-f7893dda8876 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:57:03.062 143779 DEBUG oslo_concurrency.lockutils [req-4522c0de-9269-4cb9-b603-f7893dda8876 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:57:05.065 143780 DEBUG oslo_service.periodic_task [req-9289bce7-9677-4594-8a18-d084e33a2a06 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:57:05.069 143780 DEBUG oslo_concurrency.lockutils [req-52285bae-6b03-499c-aa1f-a90679e2c683 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:57:05.069 143780 DEBUG oslo_concurrency.lockutils [req-52285bae-6b03-499c-aa1f-a90679e2c683 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:57:11.094 143787 DEBUG oslo_service.periodic_task [req-cd74bbe5-2502-466b-b9a7-4c4f8567de50 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:57:11.098 143787 DEBUG oslo_concurrency.lockutils [req-f1a641c5-a557-433d-ab4a-6a8700ddb37d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:57:11.098 143787 DEBUG oslo_concurrency.lockutils [req-f1a641c5-a557-433d-ab4a-6a8700ddb37d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:57:19.810 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4c7586462e6e44b6ae7beb5915c97568 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 00:57:19.810 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4c7586462e6e44b6ae7beb5915c97568 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 00:57:19.810 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4c7586462e6e44b6ae7beb5915c97568 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 00:57:19.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4c7586462e6e44b6ae7beb5915c97568 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 00:57:19.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:57:19.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:57:19.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:57:19.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:57:19.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4c7586462e6e44b6ae7beb5915c97568 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 00:57:19.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4c7586462e6e44b6ae7beb5915c97568 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 00:57:19.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4c7586462e6e44b6ae7beb5915c97568 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 00:57:19.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4c7586462e6e44b6ae7beb5915c97568 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 00:57:19.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:57:19.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:57:19.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:57:19.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:57:19.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:57:19.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:57:19.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:57:19.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:57:19.812 143779 DEBUG oslo_concurrency.lockutils [req-724c0520-37f7-454f-8dca-f5e51da514cd - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:57:19.812 143787 DEBUG oslo_concurrency.lockutils [req-724c0520-37f7-454f-8dca-f5e51da514cd - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:57:19.812 143780 DEBUG oslo_concurrency.lockutils [req-724c0520-37f7-454f-8dca-f5e51da514cd - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:57:19.812 143781 DEBUG oslo_concurrency.lockutils [req-724c0520-37f7-454f-8dca-f5e51da514cd - - - - -] Lock 
"host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 00:57:19.812 143779 DEBUG nova.scheduler.host_manager [req-724c0520-37f7-454f-8dca-f5e51da514cd - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 00:57:19.812 143787 DEBUG nova.scheduler.host_manager [req-724c0520-37f7-454f-8dca-f5e51da514cd - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 00:57:19.812 143780 DEBUG nova.scheduler.host_manager [req-724c0520-37f7-454f-8dca-f5e51da514cd - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 00:57:19.812 143779 DEBUG oslo_concurrency.lockutils [req-724c0520-37f7-454f-8dca-f5e51da514cd - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:57:19.812 143787 DEBUG oslo_concurrency.lockutils [req-724c0520-37f7-454f-8dca-f5e51da514cd - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:57:19.812 143781 DEBUG nova.scheduler.host_manager [req-724c0520-37f7-454f-8dca-f5e51da514cd - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 00:57:19.812 143780 DEBUG oslo_concurrency.lockutils [req-724c0520-37f7-454f-8dca-f5e51da514cd - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:57:19.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:57:19.813 143781 DEBUG oslo_concurrency.lockutils [req-724c0520-37f7-454f-8dca-f5e51da514cd - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 00:57:19.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:57:19.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:57:19.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:57:19.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:57:19.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:57:19.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:57:19.813 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:57:19.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:57:19.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:57:19.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:57:19.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:57:20.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:57:20.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:57:20.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:57:20.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:57:20.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:57:20.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 
00:57:20.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:57:20.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:57:20.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:57:20.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:57:20.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:57:20.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:57:22.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:57:22.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:57:22.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:57:22.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:57:22.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 
00:57:22.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:57:22.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:57:22.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:57:22.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:57:22.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:57:22.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:57:22.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:57:26.821 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:57:26.821 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:57:26.821 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:57:26.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 
00:57:26.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:57:26.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:57:26.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:57:26.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:57:26.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:57:26.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 00:57:26.823 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 00:57:26.823 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 00:57:28.063 143781 DEBUG oslo_service.periodic_task [req-cecf70cf-5441-48ce-ba31-7e5d6a18f2d6 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 00:57:28.067 143781 DEBUG oslo_concurrency.lockutils [req-596486d0-d96f-4822-bbda-e0a39d7bab0c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:57:28.067 143781 DEBUG oslo_concurrency.lockutils [req-596486d0-d96f-4822-bbda-e0a39d7bab0c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:57:33.073 143779 DEBUG oslo_service.periodic_task [req-4522c0de-9269-4cb9-b603-f7893dda8876 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:57:33.078 143779 DEBUG oslo_concurrency.lockutils [req-7f16c640-40a2-4092-afbd-74d37ff4837f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:57:33.078 143779 DEBUG oslo_concurrency.lockutils [req-7f16c640-40a2-4092-afbd-74d37ff4837f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:57:34.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:57:34.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:57:34.823 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:57:34.824 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:57:34.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:57:34.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:57:34.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:57:34.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:57:34.825 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:57:34.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:57:34.825 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:57:34.825 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:57:35.074 143780 DEBUG oslo_service.periodic_task [req-52285bae-6b03-499c-aa1f-a90679e2c683 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:57:35.077 143780 DEBUG oslo_concurrency.lockutils [req-306068e3-bc4b-48ba-918a-a83a5d0c15e3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:57:35.078 143780 DEBUG oslo_concurrency.lockutils [req-306068e3-bc4b-48ba-918a-a83a5d0c15e3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:57:41.107 143787 DEBUG oslo_service.periodic_task [req-f1a641c5-a557-433d-ab4a-6a8700ddb37d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:57:41.111 143787 DEBUG oslo_concurrency.lockutils [req-229aac6d-3f68-451f-bad0-ec5a414927a5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:57:41.111 143787 DEBUG oslo_concurrency.lockutils [req-229aac6d-3f68-451f-bad0-ec5a414927a5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:57:50.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:57:50.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:57:50.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:57:50.826 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:57:50.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:57:50.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:57:50.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:57:50.827 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:57:50.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:57:50.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:57:50.827 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:57:50.827 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:57:58.073 143781 DEBUG oslo_service.periodic_task [req-596486d0-d96f-4822-bbda-e0a39d7bab0c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:57:58.078 143781 DEBUG oslo_concurrency.lockutils [req-d30e98f2-615c-47e2-9382-138e08222047 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:57:58.078 143781 DEBUG oslo_concurrency.lockutils [req-d30e98f2-615c-47e2-9382-138e08222047 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:58:03.093 143779 DEBUG oslo_service.periodic_task [req-7f16c640-40a2-4092-afbd-74d37ff4837f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:58:03.097 143779 DEBUG oslo_concurrency.lockutils [req-e0303f2e-cbf7-4e3e-90b3-cb1fc4cf7493 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:58:03.098 143779 DEBUG oslo_concurrency.lockutils [req-e0303f2e-cbf7-4e3e-90b3-cb1fc4cf7493 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:58:05.083 143780 DEBUG oslo_service.periodic_task [req-306068e3-bc4b-48ba-918a-a83a5d0c15e3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:58:05.087 143780 DEBUG oslo_concurrency.lockutils [req-3dfab0ec-9f72-44b8-aa4d-d74743c1ff3f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:58:05.087 143780 DEBUG oslo_concurrency.lockutils [req-3dfab0ec-9f72-44b8-aa4d-d74743c1ff3f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:58:12.083 143787 DEBUG oslo_service.periodic_task [req-229aac6d-3f68-451f-bad0-ec5a414927a5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:58:12.088 143787 DEBUG oslo_concurrency.lockutils [req-d7e9d420-016d-4ec5-a696-f2a361da5eff - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:58:12.088 143787 DEBUG oslo_concurrency.lockutils [req-d7e9d420-016d-4ec5-a696-f2a361da5eff - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:58:23.056 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:58:23.056 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:58:23.056 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:58:23.057 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:58:23.058 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:58:23.058 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:58:23.073 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:58:23.074 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:58:23.074 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:58:23.103 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:58:23.103 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:58:23.104 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:58:28.087 143781 DEBUG oslo_service.periodic_task [req-d30e98f2-615c-47e2-9382-138e08222047 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:58:28.091 143781 DEBUG oslo_concurrency.lockutils [req-9f1615ee-d855-4a2e-853d-6f34457d76b7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:58:28.092 143781 DEBUG oslo_concurrency.lockutils [req-9f1615ee-d855-4a2e-853d-6f34457d76b7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:58:33.111 143779 DEBUG oslo_service.periodic_task [req-e0303f2e-cbf7-4e3e-90b3-cb1fc4cf7493 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:58:33.118 143779 DEBUG oslo_concurrency.lockutils [req-c04a639f-9551-4534-8dbd-5b9e4ef21923 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:58:33.118 143779 DEBUG oslo_concurrency.lockutils [req-c04a639f-9551-4534-8dbd-5b9e4ef21923 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:58:35.093 143780 DEBUG oslo_service.periodic_task [req-3dfab0ec-9f72-44b8-aa4d-d74743c1ff3f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:58:35.096 143780 DEBUG oslo_concurrency.lockutils [req-a6348e2f-759e-4c5c-a133-56c274fe44cb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:58:35.097 143780 DEBUG oslo_concurrency.lockutils [req-a6348e2f-759e-4c5c-a133-56c274fe44cb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:58:42.095 143787 DEBUG oslo_service.periodic_task [req-d7e9d420-016d-4ec5-a696-f2a361da5eff - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:58:42.099 143787 DEBUG oslo_concurrency.lockutils [req-00ab98bf-d798-4929-aa7c-d74311b5490d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:58:42.099 143787 DEBUG oslo_concurrency.lockutils [req-00ab98bf-d798-4929-aa7c-d74311b5490d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:58:58.100 143781 DEBUG oslo_service.periodic_task [req-9f1615ee-d855-4a2e-853d-6f34457d76b7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:58:58.104 143781 DEBUG oslo_concurrency.lockutils [req-9c645144-5113-402a-8644-d572046c32af - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:58:58.104 143781 DEBUG oslo_concurrency.lockutils [req-9c645144-5113-402a-8644-d572046c32af - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:59:04.057 143779 DEBUG oslo_service.periodic_task [req-c04a639f-9551-4534-8dbd-5b9e4ef21923 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:59:04.061 143779 DEBUG oslo_concurrency.lockutils [req-5914248c-0654-4320-89d9-621639200dfc - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:59:04.062 143779 DEBUG oslo_concurrency.lockutils [req-5914248c-0654-4320-89d9-621639200dfc - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:59:05.101 143780 DEBUG oslo_service.periodic_task [req-a6348e2f-759e-4c5c-a133-56c274fe44cb - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:59:05.104 143780 DEBUG oslo_concurrency.lockutils [req-7e600e61-6e67-47c9-b418-6851c2bb6a3b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:59:05.105 143780 DEBUG oslo_concurrency.lockutils [req-7e600e61-6e67-47c9-b418-6851c2bb6a3b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:59:13.083 143787 DEBUG oslo_service.periodic_task [req-00ab98bf-d798-4929-aa7c-d74311b5490d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:59:13.087 143787 DEBUG oslo_concurrency.lockutils [req-c8ee470f-0cde-4373-a61c-e7c18591e254 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:59:13.088 143787 DEBUG oslo_concurrency.lockutils [req-c8ee470f-0cde-4373-a61c-e7c18591e254 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:59:20.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1f8bc699a72440f1ab934e51d388b074 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 00:59:20.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1f8bc699a72440f1ab934e51d388b074 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 00:59:20.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1f8bc699a72440f1ab934e51d388b074 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 00:59:20.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1f8bc699a72440f1ab934e51d388b074 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 00:59:20.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:20.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:20.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:20.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:20.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1f8bc699a72440f1ab934e51d388b074 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 00:59:20.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1f8bc699a72440f1ab934e51d388b074 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 00:59:20.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1f8bc699a72440f1ab934e51d388b074 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 00:59:20.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1f8bc699a72440f1ab934e51d388b074 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 00:59:20.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:20.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:20.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:20.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:20.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:20.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:20.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:20.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:20.814 143780 DEBUG oslo_concurrency.lockutils [req-0653be7a-4f42-4961-b0b3-98f259e8fc40 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:59:20.814 143781 DEBUG oslo_concurrency.lockutils [req-0653be7a-4f42-4961-b0b3-98f259e8fc40 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:59:20.814 143779 DEBUG oslo_concurrency.lockutils [req-0653be7a-4f42-4961-b0b3-98f259e8fc40 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:59:20.814 143787 DEBUG oslo_concurrency.lockutils [req-0653be7a-4f42-4961-b0b3-98f259e8fc40 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:59:20.814 143780 DEBUG nova.scheduler.host_manager [req-0653be7a-4f42-4961-b0b3-98f259e8fc40 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 00:59:20.814 143781 DEBUG nova.scheduler.host_manager [req-0653be7a-4f42-4961-b0b3-98f259e8fc40 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 00:59:20.814 143779 DEBUG nova.scheduler.host_manager [req-0653be7a-4f42-4961-b0b3-98f259e8fc40 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 00:59:20.814 143787 DEBUG nova.scheduler.host_manager [req-0653be7a-4f42-4961-b0b3-98f259e8fc40 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 00:59:20.814 143780 DEBUG oslo_concurrency.lockutils [req-0653be7a-4f42-4961-b0b3-98f259e8fc40 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:59:20.814 143781 DEBUG oslo_concurrency.lockutils [req-0653be7a-4f42-4961-b0b3-98f259e8fc40 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:59:20.814 143779 DEBUG oslo_concurrency.lockutils [req-0653be7a-4f42-4961-b0b3-98f259e8fc40 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:59:20.814 143787 DEBUG oslo_concurrency.lockutils [req-0653be7a-4f42-4961-b0b3-98f259e8fc40 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:59:20.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:59:20.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:20.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:59:20.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:20.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:20.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:20.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:59:20.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:59:20.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:20.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:20.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:20.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:21.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:59:21.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:59:21.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:21.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:21.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:21.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:21.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:59:21.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:21.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:21.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:59:21.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:21.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:23.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:59:23.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:23.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:59:23.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:23.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:23.819 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:23.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:59:23.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:59:23.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:23.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:23.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:23.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:27.821 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:59:27.821 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:59:27.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:27.821 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:59:27.821 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:59:27.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:27.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:27.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:27.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:27.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:27.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:27.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:29.042 143781 DEBUG oslo_service.periodic_task [req-9c645144-5113-402a-8644-d572046c32af - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:59:29.045 143781 DEBUG oslo_concurrency.lockutils [req-87a65108-ad87-4dd1-974f-4479c6fe4471 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:59:29.046 143781 DEBUG oslo_concurrency.lockutils [req-87a65108-ad87-4dd1-974f-4479c6fe4471 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:59:35.058 143779 DEBUG oslo_service.periodic_task [req-5914248c-0654-4320-89d9-621639200dfc - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:59:35.061 143779 DEBUG oslo_concurrency.lockutils [req-4a9162f0-262a-4e81-9ca7-53d2c195d065 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:59:35.062 143779 DEBUG oslo_concurrency.lockutils [req-4a9162f0-262a-4e81-9ca7-53d2c195d065 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:59:35.110 143780 DEBUG oslo_service.periodic_task [req-7e600e61-6e67-47c9-b418-6851c2bb6a3b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:59:35.114 143780 DEBUG oslo_concurrency.lockutils [req-e55eefd9-d8fd-4eed-a710-9106ee3e9006 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:59:35.114 143780 DEBUG oslo_concurrency.lockutils [req-e55eefd9-d8fd-4eed-a710-9106ee3e9006 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:59:35.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:59:35.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:35.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:35.823 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:59:35.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:59:35.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:35.823 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:35.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:35.823 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:35.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:59:35.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:35.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:44.082 143787 DEBUG oslo_service.periodic_task [req-c8ee470f-0cde-4373-a61c-e7c18591e254 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 00:59:44.086 143787 DEBUG oslo_concurrency.lockutils [req-aa1177a6-521f-4262-913e-c28dec786c5d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 00:59:44.086 143787 DEBUG oslo_concurrency.lockutils [req-aa1177a6-521f-4262-913e-c28dec786c5d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 00:59:51.825 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:59:51.825 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:51.825 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:51.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:59:51.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:59:51.826 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:51.826 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:51.826 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:51.826 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 00:59:51.827 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 00:59:51.828 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 00:59:51.828 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:00:00.042 143781 DEBUG oslo_service.periodic_task [req-87a65108-ad87-4dd1-974f-4479c6fe4471 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:00:00.045 143781 DEBUG oslo_concurrency.lockutils [req-92e6e0ea-181e-41f7-9111-0e73ee1f61b8 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:00:00.046 143781 DEBUG oslo_concurrency.lockutils [req-92e6e0ea-181e-41f7-9111-0e73ee1f61b8 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:00:05.119 143780 DEBUG oslo_service.periodic_task [req-e55eefd9-d8fd-4eed-a710-9106ee3e9006 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:00:05.123 143780 DEBUG oslo_concurrency.lockutils [req-ec8de425-c3ab-4498-ac49-70ffaa24375b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:00:05.123 143780 DEBUG oslo_concurrency.lockutils [req-ec8de425-c3ab-4498-ac49-70ffaa24375b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:00:06.057 143779 DEBUG oslo_service.periodic_task [req-4a9162f0-262a-4e81-9ca7-53d2c195d065 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:00:06.061 143779 DEBUG oslo_concurrency.lockutils [req-9728dc4e-af78-402d-be77-2af223cfd108 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:00:06.061 143779 DEBUG oslo_concurrency.lockutils [req-9728dc4e-af78-402d-be77-2af223cfd108 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:00:14.090 143787 DEBUG oslo_service.periodic_task [req-aa1177a6-521f-4262-913e-c28dec786c5d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:00:14.094 143787 DEBUG oslo_concurrency.lockutils [req-f50562e7-7b7e-4ef7-82ce-85c4722e6ac2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:00:14.094 143787 DEBUG oslo_concurrency.lockutils [req-f50562e7-7b7e-4ef7-82ce-85c4722e6ac2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:00:23.827 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:00:23.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:00:23.827 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:00:23.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:00:23.827 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:00:23.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:00:23.828 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:00:23.828 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:00:23.828 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:00:23.829 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:00:23.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:00:23.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:00:30.056 143781 DEBUG oslo_service.periodic_task [req-92e6e0ea-181e-41f7-9111-0e73ee1f61b8 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:00:30.060 143781 DEBUG oslo_concurrency.lockutils [req-de40cefb-c68c-400a-b145-27f3ee9279ec - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:00:30.060 143781 DEBUG oslo_concurrency.lockutils [req-de40cefb-c68c-400a-b145-27f3ee9279ec - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:00:36.044 143780 DEBUG oslo_service.periodic_task [req-ec8de425-c3ab-4498-ac49-70ffaa24375b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:00:36.048 143780 DEBUG oslo_concurrency.lockutils [req-cad006e2-66b5-47e1-b496-e5d6fc615d45 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:00:36.048 143780 DEBUG oslo_concurrency.lockutils [req-cad006e2-66b5-47e1-b496-e5d6fc615d45 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:00:37.058 143779 DEBUG oslo_service.periodic_task [req-9728dc4e-af78-402d-be77-2af223cfd108 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:00:37.062 143779 DEBUG oslo_concurrency.lockutils [req-6dab42df-2d89-4b81-b474-750d969f2655 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:00:37.062 143779 DEBUG oslo_concurrency.lockutils [req-6dab42df-2d89-4b81-b474-750d969f2655 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:00:45.083 143787 DEBUG oslo_service.periodic_task [req-f50562e7-7b7e-4ef7-82ce-85c4722e6ac2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:00:45.087 143787 DEBUG oslo_concurrency.lockutils [req-31e611e9-1a76-4ac3-bf10-e0c12b99ee2c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:00:45.087 143787 DEBUG oslo_concurrency.lockutils [req-31e611e9-1a76-4ac3-bf10-e0c12b99ee2c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:01:00.069 143781 DEBUG oslo_service.periodic_task [req-de40cefb-c68c-400a-b145-27f3ee9279ec - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:01:00.073 143781 DEBUG oslo_concurrency.lockutils [req-e9907e54-8ed5-4409-ac0f-e0ea866b65aa - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:01:00.073 143781 DEBUG oslo_concurrency.lockutils [req-e9907e54-8ed5-4409-ac0f-e0ea866b65aa - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:01:06.054 143780 DEBUG oslo_service.periodic_task [req-cad006e2-66b5-47e1-b496-e5d6fc615d45 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:01:06.059 143780 DEBUG oslo_concurrency.lockutils [req-adf6561c-125e-4d1a-afb4-2eb6346c2b89 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:01:06.059 143780 DEBUG oslo_concurrency.lockutils [req-adf6561c-125e-4d1a-afb4-2eb6346c2b89 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:01:07.069 143779 DEBUG oslo_service.periodic_task [req-6dab42df-2d89-4b81-b474-750d969f2655 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:01:07.073 143779 DEBUG oslo_concurrency.lockutils [req-4f85c68d-e60b-488b-91a4-2506cd9ddef5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:01:07.073 143779 DEBUG oslo_concurrency.lockutils [req-4f85c68d-e60b-488b-91a4-2506cd9ddef5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:01:15.093 143787 DEBUG oslo_service.periodic_task [req-31e611e9-1a76-4ac3-bf10-e0c12b99ee2c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:01:15.097 143787 DEBUG oslo_concurrency.lockutils [req-b07f6c9e-303b-4436-bdaa-3af4a9dfb572 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:01:15.098 143787 DEBUG oslo_concurrency.lockutils [req-b07f6c9e-303b-4436-bdaa-3af4a9dfb572 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:01:24.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: aa0f9133ea7747faa8ce61b74671239a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:01:24.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: aa0f9133ea7747faa8ce61b74671239a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:01:24.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: aa0f9133ea7747faa8ce61b74671239a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:01:24.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:01:24.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:01:24.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:01:24.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: aa0f9133ea7747faa8ce61b74671239a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:01:24.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: aa0f9133ea7747faa8ce61b74671239a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:01:24.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: aa0f9133ea7747faa8ce61b74671239a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:01:24.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: aa0f9133ea7747faa8ce61b74671239a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:01:24.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:01:24.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:01:24.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:01:24.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:01:24.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:01:24.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:01:24.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:01:24.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: aa0f9133ea7747faa8ce61b74671239a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:01:24.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:01:24.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:01:24.813 143781 DEBUG oslo_concurrency.lockutils [req-9dffa84e-6a7d-4ce5-8bb3-cb1f4446ef3c - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:01:24.813 143780 DEBUG oslo_concurrency.lockutils [req-9dffa84e-6a7d-4ce5-8bb3-cb1f4446ef3c - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:01:24.813 143787 DEBUG oslo_concurrency.lockutils [req-9dffa84e-6a7d-4ce5-8bb3-cb1f4446ef3c - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:01:24.814 143781 DEBUG nova.scheduler.host_manager [req-9dffa84e-6a7d-4ce5-8bb3-cb1f4446ef3c - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:01:24.814 143780 DEBUG nova.scheduler.host_manager [req-9dffa84e-6a7d-4ce5-8bb3-cb1f4446ef3c - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:01:24.814 143787 DEBUG nova.scheduler.host_manager [req-9dffa84e-6a7d-4ce5-8bb3-cb1f4446ef3c - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:01:24.814 143781 DEBUG oslo_concurrency.lockutils [req-9dffa84e-6a7d-4ce5-8bb3-cb1f4446ef3c - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:01:24.814 143787 DEBUG oslo_concurrency.lockutils [req-9dffa84e-6a7d-4ce5-8bb3-cb1f4446ef3c - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:01:24.814 143780 DEBUG oslo_concurrency.lockutils [req-9dffa84e-6a7d-4ce5-8bb3-cb1f4446ef3c - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:01:24.814 143779 DEBUG oslo_concurrency.lockutils [req-9dffa84e-6a7d-4ce5-8bb3-cb1f4446ef3c - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:01:24.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:01:24.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:01:24.814 143779 DEBUG nova.scheduler.host_manager [req-9dffa84e-6a7d-4ce5-8bb3-cb1f4446ef3c - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:01:24.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:01:24.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:01:24.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:01:24.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:01:24.814 143779 DEBUG oslo_concurrency.lockutils [req-9dffa84e-6a7d-4ce5-8bb3-cb1f4446ef3c - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:01:24.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:01:24.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:01:24.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:01:24.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:01:24.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:01:24.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:01:25.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:01:25.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:01:25.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:01:25.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:01:25.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:01:25.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:01:25.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:01:25.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:01:25.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:01:25.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:01:25.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:01:25.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:01:27.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:01:27.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:01:27.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:01:27.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:01:27.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:01:27.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:01:27.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:01:27.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:01:27.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:01:27.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:01:27.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:01:27.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:01:30.079 143781 DEBUG oslo_service.periodic_task [req-e9907e54-8ed5-4409-ac0f-e0ea866b65aa - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:01:30.083 143781 DEBUG oslo_concurrency.lockutils [req-c661fdf8-7cd6-4356-8506-a1e8fc234e25 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:01:30.083 143781 DEBUG oslo_concurrency.lockutils [req-c661fdf8-7cd6-4356-8506-a1e8fc234e25 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:01:31.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:01:31.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:01:31.822 143779 DEBUG
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:01:31.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:01:31.823 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:01:31.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:01:31.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:01:31.823 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:01:31.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:01:31.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:01:31.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:01:31.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:01:37.045 143780 DEBUG oslo_service.periodic_task [req-adf6561c-125e-4d1a-afb4-2eb6346c2b89 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks 
/usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:01:37.048 143780 DEBUG oslo_concurrency.lockutils [req-8b8fb8b6-0aa5-40bf-927b-069c1579a058 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:01:37.049 143780 DEBUG oslo_concurrency.lockutils [req-8b8fb8b6-0aa5-40bf-927b-069c1579a058 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:01:37.079 143779 DEBUG oslo_service.periodic_task [req-4f85c68d-e60b-488b-91a4-2506cd9ddef5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:01:37.083 143779 DEBUG oslo_concurrency.lockutils [req-d6dc66e7-6e99-4f31-bff5-240ba6d5d6ca - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:01:37.084 143779 DEBUG oslo_concurrency.lockutils [req-d6dc66e7-6e99-4f31-bff5-240ba6d5d6ca - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:01:39.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:01:39.825 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:01:39.825 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:01:39.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:01:39.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:01:39.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:01:39.828 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:01:39.828 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:01:39.828 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:01:39.828 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:01:39.828 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:01:39.828 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:01:46.083 143787 DEBUG oslo_service.periodic_task 
[req-b07f6c9e-303b-4436-bdaa-3af4a9dfb572 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:01:46.087 143787 DEBUG oslo_concurrency.lockutils [req-fad3f1c1-698a-4932-9be6-7c7f394c8579 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:01:46.087 143787 DEBUG oslo_concurrency.lockutils [req-fad3f1c1-698a-4932-9be6-7c7f394c8579 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:01:55.827 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:01:55.827 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:01:55.827 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:01:55.829 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:01:55.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:01:55.831 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:01:55.831 143780 
DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:01:55.831 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:01:55.831 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:01:55.832 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:01:55.832 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:01:55.832 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:02:01.041 143781 DEBUG oslo_service.periodic_task [req-c661fdf8-7cd6-4356-8506-a1e8fc234e25 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:02:01.045 143781 DEBUG oslo_concurrency.lockutils [req-2dfe5875-69ea-4712-94b7-8319bec9d660 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:02:01.046 143781 DEBUG oslo_concurrency.lockutils [req-2dfe5875-69ea-4712-94b7-8319bec9d660 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 
2026-04-02 01:02:07.090 143779 DEBUG oslo_service.periodic_task [req-d6dc66e7-6e99-4f31-bff5-240ba6d5d6ca - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:02:07.094 143779 DEBUG oslo_concurrency.lockutils [req-b582000f-ab23-4e74-a5f2-6e63d26c6bf0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:02:07.094 143779 DEBUG oslo_concurrency.lockutils [req-b582000f-ab23-4e74-a5f2-6e63d26c6bf0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:02:08.044 143780 DEBUG oslo_service.periodic_task [req-8b8fb8b6-0aa5-40bf-927b-069c1579a058 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:02:08.047 143780 DEBUG oslo_concurrency.lockutils [req-219cfa21-f964-4496-8fba-ea115011e335 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:02:08.048 143780 DEBUG oslo_concurrency.lockutils [req-219cfa21-f964-4496-8fba-ea115011e335 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:02:16.093 143787 DEBUG oslo_service.periodic_task [req-fad3f1c1-698a-4932-9be6-7c7f394c8579 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:02:16.097 143787 DEBUG oslo_concurrency.lockutils [req-a2ce0fd9-b86c-4bdc-8fa8-9de38d8e0b6a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:02:16.098 143787 DEBUG oslo_concurrency.lockutils [req-a2ce0fd9-b86c-4bdc-8fa8-9de38d8e0b6a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:02:27.832 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:02:27.832 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:02:27.832 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:02:27.834 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:02:27.834 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:02:27.835 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:02:27.836 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:02:27.836 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:02:27.837 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:02:27.837 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:02:27.837 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:02:27.837 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:02:31.052 143781 DEBUG oslo_service.periodic_task [req-2dfe5875-69ea-4712-94b7-8319bec9d660 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:02:31.056 143781 DEBUG oslo_concurrency.lockutils [req-83a45488-5b25-434c-99b6-6e8c4070642c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:02:31.056 143781 DEBUG oslo_concurrency.lockutils [req-83a45488-5b25-434c-99b6-6e8c4070642c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:02:37.101 143779 DEBUG oslo_service.periodic_task [req-b582000f-ab23-4e74-a5f2-6e63d26c6bf0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:02:37.106 143779 DEBUG oslo_concurrency.lockutils [req-fb85c016-af95-4d21-a254-1d3f4efc9b8f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:02:37.106 143779 DEBUG oslo_concurrency.lockutils [req-fb85c016-af95-4d21-a254-1d3f4efc9b8f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:02:38.053 143780 DEBUG oslo_service.periodic_task [req-219cfa21-f964-4496-8fba-ea115011e335 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:02:38.057 143780 DEBUG oslo_concurrency.lockutils [req-ea687b9f-8bc9-4650-8138-a2b2bdbd6d2a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:02:38.057 143780 DEBUG oslo_concurrency.lockutils [req-ea687b9f-8bc9-4650-8138-a2b2bdbd6d2a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:02:47.083 143787 DEBUG oslo_service.periodic_task [req-a2ce0fd9-b86c-4bdc-8fa8-9de38d8e0b6a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:02:47.087 143787 DEBUG oslo_concurrency.lockutils [req-445d183c-2ad2-4d9e-a2d3-83276fa4f3df - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:02:47.087 143787 DEBUG oslo_concurrency.lockutils [req-445d183c-2ad2-4d9e-a2d3-83276fa4f3df - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:03:01.067 143781 DEBUG oslo_service.periodic_task [req-83a45488-5b25-434c-99b6-6e8c4070642c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:03:01.071 143781 DEBUG oslo_concurrency.lockutils [req-5cbefadf-035a-4292-8054-58b18cffcfd9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:03:01.071 143781 DEBUG oslo_concurrency.lockutils [req-5cbefadf-035a-4292-8054-58b18cffcfd9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:03:08.058 143779 DEBUG oslo_service.periodic_task [req-fb85c016-af95-4d21-a254-1d3f4efc9b8f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:03:08.062 143779 DEBUG oslo_concurrency.lockutils [req-41fcb0cb-94ec-4738-85fd-1e080838da31 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:03:08.062 143779 DEBUG oslo_concurrency.lockutils [req-41fcb0cb-94ec-4738-85fd-1e080838da31 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:03:08.062 143780 DEBUG oslo_service.periodic_task [req-ea687b9f-8bc9-4650-8138-a2b2bdbd6d2a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:03:08.066 143780 DEBUG oslo_concurrency.lockutils [req-344c6ebc-3501-4948-b6b2-2b6d49893afe - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:03:08.066 143780 DEBUG oslo_concurrency.lockutils [req-344c6ebc-3501-4948-b6b2-2b6d49893afe - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:03:18.083 143787 DEBUG oslo_service.periodic_task [req-445d183c-2ad2-4d9e-a2d3-83276fa4f3df - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:03:18.087 143787 DEBUG oslo_concurrency.lockutils [req-4c316fad-e621-42b0-8fa5-67ed9d2c38be - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:03:18.088 143787 DEBUG oslo_concurrency.lockutils [req-4c316fad-e621-42b0-8fa5-67ed9d2c38be - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:03:27.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 036952bfead8450783e1c2af69e0a9bd __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:03:27.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 036952bfead8450783e1c2af69e0a9bd __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:03:27.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 036952bfead8450783e1c2af69e0a9bd __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:03:27.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 036952bfead8450783e1c2af69e0a9bd __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:03:27.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:03:27.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:03:27.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:03:27.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 036952bfead8450783e1c2af69e0a9bd poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:03:27.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 036952bfead8450783e1c2af69e0a9bd poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:03:27.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 036952bfead8450783e1c2af69e0a9bd poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:03:27.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:03:27.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:03:27.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:03:27.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:03:27.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:03:27.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 036952bfead8450783e1c2af69e0a9bd poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:03:27.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:03:27.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:03:27.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:03:27.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:03:27.813 143787 DEBUG oslo_concurrency.lockutils [req-0fb3e85f-08a8-48a7-93de-dfe4da9946aa - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:03:27.813 143780 DEBUG oslo_concurrency.lockutils [req-0fb3e85f-08a8-48a7-93de-dfe4da9946aa - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:03:27.813 143779 DEBUG oslo_concurrency.lockutils [req-0fb3e85f-08a8-48a7-93de-dfe4da9946aa - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:03:27.813 143787 DEBUG nova.scheduler.host_manager [req-0fb3e85f-08a8-48a7-93de-dfe4da9946aa - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:03:27.813 143780 DEBUG nova.scheduler.host_manager [req-0fb3e85f-08a8-48a7-93de-dfe4da9946aa - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:03:27.813 143779 DEBUG nova.scheduler.host_manager [req-0fb3e85f-08a8-48a7-93de-dfe4da9946aa - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:03:27.813 143787 DEBUG oslo_concurrency.lockutils [req-0fb3e85f-08a8-48a7-93de-dfe4da9946aa - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:03:27.813 143780 DEBUG oslo_concurrency.lockutils [req-0fb3e85f-08a8-48a7-93de-dfe4da9946aa - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:03:27.813 143779 DEBUG oslo_concurrency.lockutils [req-0fb3e85f-08a8-48a7-93de-dfe4da9946aa - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:03:27.813 143781 DEBUG oslo_concurrency.lockutils [req-0fb3e85f-08a8-48a7-93de-dfe4da9946aa - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:03:27.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:03:27.814 143781 DEBUG nova.scheduler.host_manager [req-0fb3e85f-08a8-48a7-93de-dfe4da9946aa - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:03:27.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:03:27.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:03:27.814 143781 DEBUG oslo_concurrency.lockutils [req-0fb3e85f-08a8-48a7-93de-dfe4da9946aa - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:03:27.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:03:27.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:03:27.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:03:27.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:03:27.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:03:27.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:03:27.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:03:27.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:03:27.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:03:28.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:03:28.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:03:28.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:03:28.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:03:28.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:03:28.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:03:28.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:03:28.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:03:28.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:03:28.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:03:28.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:03:28.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:03:30.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:03:30.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:03:30.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:03:30.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:03:30.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:03:30.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:03:30.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:03:30.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:03:30.819 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:03:30.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:03:30.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:03:30.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:03:31.075 143781 DEBUG oslo_service.periodic_task [req-5cbefadf-035a-4292-8054-58b18cffcfd9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:03:31.080 143781 DEBUG oslo_concurrency.lockutils [req-f4c70ae3-65eb-4e41-84bf-e56bc9080cb8 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:03:31.080 143781 DEBUG oslo_concurrency.lockutils [req-f4c70ae3-65eb-4e41-84bf-e56bc9080cb8 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:03:34.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:03:34.820 143787 DEBUG oslo_messaging._drivers.amqpdriver 
[-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:03:34.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:03:34.821 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:03:34.821 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:03:34.821 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:03:34.821 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:03:34.821 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:03:34.821 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:03:34.821 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:03:34.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:03:34.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:03:38.073 143780 DEBUG oslo_service.periodic_task 
[req-344c6ebc-3501-4948-b6b2-2b6d49893afe - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:03:38.077 143780 DEBUG oslo_concurrency.lockutils [req-69fcb584-5a00-410c-b093-216d5096ed7e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:03:38.077 143780 DEBUG oslo_concurrency.lockutils [req-69fcb584-5a00-410c-b093-216d5096ed7e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:03:39.057 143779 DEBUG oslo_service.periodic_task [req-41fcb0cb-94ec-4738-85fd-1e080838da31 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:03:39.061 143779 DEBUG oslo_concurrency.lockutils [req-0f11a4bb-df97-4b46-96ce-c9804bf3d874 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:03:39.061 143779 DEBUG oslo_concurrency.lockutils [req-0f11a4bb-df97-4b46-96ce-c9804bf3d874 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:03:42.826 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 
2026-04-02 01:03:42.826 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:03:42.826 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:03:42.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:03:42.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:03:42.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:03:42.828 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:03:42.828 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:03:42.828 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:03:42.829 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:03:42.830 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:03:42.830 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:03:48.095 143787 DEBUG oslo_service.periodic_task [req-4c316fad-e621-42b0-8fa5-67ed9d2c38be - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:03:48.098 143787 DEBUG oslo_concurrency.lockutils [req-646c8b0f-3a20-4885-8ace-451fa4f11c2b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:03:48.099 143787 DEBUG oslo_concurrency.lockutils [req-646c8b0f-3a20-4885-8ace-451fa4f11c2b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:03:58.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:03:58.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:03:58.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:03:58.829 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:03:58.830 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:03:58.830 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:03:58.830 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:03:58.831 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:03:58.831 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:03:58.831 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:03:58.831 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:03:58.831 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:04:01.087 143781 DEBUG oslo_service.periodic_task [req-f4c70ae3-65eb-4e41-84bf-e56bc9080cb8 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:04:01.090 143781 DEBUG oslo_concurrency.lockutils [req-6dbe3ba5-9a2c-4797-ab70-46359429fd71 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:04:01.092 143781 DEBUG oslo_concurrency.lockutils [req-6dbe3ba5-9a2c-4797-ab70-46359429fd71 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:04:09.044 143780 DEBUG oslo_service.periodic_task [req-69fcb584-5a00-410c-b093-216d5096ed7e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:04:09.048 143780 DEBUG oslo_concurrency.lockutils [req-8814eba5-1454-428a-8af8-2320b5a13d64 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:04:09.048 143780 DEBUG oslo_concurrency.lockutils [req-8814eba5-1454-428a-8af8-2320b5a13d64 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:04:10.057 143779 DEBUG oslo_service.periodic_task [req-0f11a4bb-df97-4b46-96ce-c9804bf3d874 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:04:10.061 143779 DEBUG oslo_concurrency.lockutils [req-d6ede5c8-e456-4d81-aa96-94327de5426b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:04:10.062 143779 DEBUG oslo_concurrency.lockutils [req-d6ede5c8-e456-4d81-aa96-94327de5426b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:04:18.107 143787 DEBUG oslo_service.periodic_task [req-646c8b0f-3a20-4885-8ace-451fa4f11c2b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:04:18.110 143787 DEBUG oslo_concurrency.lockutils [req-b84086ce-02fc-49db-9370-506ef2be41b1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:04:18.110 143787 DEBUG oslo_concurrency.lockutils [req-b84086ce-02fc-49db-9370-506ef2be41b1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:04:30.835 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:04:30.836 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:04:30.836 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:04:30.837 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:04:30.837 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:04:30.837 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:04:30.837 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:04:30.837 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:04:30.837 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:04:30.838 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:04:30.838 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:04:30.838 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:04:31.097 143781 DEBUG oslo_service.periodic_task [req-6dbe3ba5-9a2c-4797-ab70-46359429fd71 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:04:31.101 143781 DEBUG oslo_concurrency.lockutils [req-348db181-48cc-4435-b0c9-8d35b326384e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:04:31.101 143781 DEBUG oslo_concurrency.lockutils [req-348db181-48cc-4435-b0c9-8d35b326384e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:04:39.053 143780 DEBUG oslo_service.periodic_task [req-8814eba5-1454-428a-8af8-2320b5a13d64 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:04:39.057 143780 DEBUG oslo_concurrency.lockutils [req-4d17d25d-7770-4320-82cb-782d3d3f2e94 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:04:39.057 143780 DEBUG oslo_concurrency.lockutils [req-4d17d25d-7770-4320-82cb-782d3d3f2e94 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:04:41.057 143779 DEBUG oslo_service.periodic_task [req-d6ede5c8-e456-4d81-aa96-94327de5426b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:04:41.061 143779 DEBUG oslo_concurrency.lockutils [req-59b679c4-d7a1-447c-a4e1-d3ec2174a953 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:04:41.061 143779 DEBUG oslo_concurrency.lockutils [req-59b679c4-d7a1-447c-a4e1-d3ec2174a953 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:04:49.084 143787 DEBUG oslo_service.periodic_task [req-b84086ce-02fc-49db-9370-506ef2be41b1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:04:49.088 143787 DEBUG oslo_concurrency.lockutils [req-66b22d93-eb9a-44d4-8b3f-9d46adf17bd0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:04:49.088 143787 DEBUG oslo_concurrency.lockutils [req-66b22d93-eb9a-44d4-8b3f-9d46adf17bd0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:05:01.112 143781 DEBUG oslo_service.periodic_task [req-348db181-48cc-4435-b0c9-8d35b326384e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:05:01.116 143781 DEBUG oslo_concurrency.lockutils [req-2a343859-48b9-4fb5-b6a8-24300cd3ec28 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:05:01.116 143781 DEBUG oslo_concurrency.lockutils [req-2a343859-48b9-4fb5-b6a8-24300cd3ec28 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:05:10.044 143780 DEBUG oslo_service.periodic_task [req-4d17d25d-7770-4320-82cb-782d3d3f2e94 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:05:10.048 143780 DEBUG oslo_concurrency.lockutils [req-b8efeb8e-8241-4aa6-9872-3c6adb3db4a1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:05:10.048 143780 DEBUG oslo_concurrency.lockutils [req-b8efeb8e-8241-4aa6-9872-3c6adb3db4a1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:05:12.057 143779 DEBUG oslo_service.periodic_task [req-59b679c4-d7a1-447c-a4e1-d3ec2174a953 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:05:12.061 143779 DEBUG oslo_concurrency.lockutils [req-5c59ccd2-c82d-413d-ab1e-f6bea59ecdd9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:05:12.062 143779 DEBUG oslo_concurrency.lockutils [req-5c59ccd2-c82d-413d-ab1e-f6bea59ecdd9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:05:19.096 143787 DEBUG oslo_service.periodic_task [req-66b22d93-eb9a-44d4-8b3f-9d46adf17bd0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:05:19.100 143787 DEBUG oslo_concurrency.lockutils [req-5ac8f40f-4aae-4dd7-8bc5-b9b6989311e3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:05:19.100 143787 DEBUG oslo_concurrency.lockutils [req-5ac8f40f-4aae-4dd7-8bc5-b9b6989311e3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:05:30.101 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ceb2429f081742dcb4d64bdc92bb895e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:05:30.101 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ceb2429f081742dcb4d64bdc92bb895e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:05:30.101 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ceb2429f081742dcb4d64bdc92bb895e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:05:30.101 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ceb2429f081742dcb4d64bdc92bb895e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:05:30.101 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:05:30.101 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:05:30.101 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ceb2429f081742dcb4d64bdc92bb895e poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:05:30.101 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:05:30.101 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:05:30.101 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ceb2429f081742dcb4d64bdc92bb895e poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:05:30.101 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ceb2429f081742dcb4d64bdc92bb895e poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:05:30.101 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ceb2429f081742dcb4d64bdc92bb895e poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:05:30.101 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:05:30.101 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:05:30.101 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:05:30.102 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:05:30.102 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:05:30.102 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:05:30.102 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:05:30.102 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:05:30.102 143779 DEBUG oslo_concurrency.lockutils [req-15bd0a90-ccdb-413a-a19e-0eb0038ac265 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:05:30.102 143781 DEBUG oslo_concurrency.lockutils [req-15bd0a90-ccdb-413a-a19e-0eb0038ac265 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:05:30.102 143781 DEBUG nova.scheduler.host_manager [req-15bd0a90-ccdb-413a-a19e-0eb0038ac265 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:05:30.102 143779 DEBUG nova.scheduler.host_manager [req-15bd0a90-ccdb-413a-a19e-0eb0038ac265 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:05:30.102 143781 DEBUG oslo_concurrency.lockutils [req-15bd0a90-ccdb-413a-a19e-0eb0038ac265 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:05:30.103 143779 DEBUG oslo_concurrency.lockutils [req-15bd0a90-ccdb-413a-a19e-0eb0038ac265 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:05:30.102 143787 DEBUG oslo_concurrency.lockutils [req-15bd0a90-ccdb-413a-a19e-0eb0038ac265 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:05:30.102 143780 DEBUG oslo_concurrency.lockutils [req-15bd0a90-ccdb-413a-a19e-0eb0038ac265 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:05:30.103 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:05:30.103 143787 DEBUG nova.scheduler.host_manager [req-15bd0a90-ccdb-413a-a19e-0eb0038ac265 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:05:30.103 143780 DEBUG nova.scheduler.host_manager [req-15bd0a90-ccdb-413a-a19e-0eb0038ac265 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'.
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:05:30.103 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:05:30.103 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:05:30.103 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:05:30.103 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:05:30.103 143787 DEBUG oslo_concurrency.lockutils [req-15bd0a90-ccdb-413a-a19e-0eb0038ac265 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:05:30.103 143780 DEBUG oslo_concurrency.lockutils [req-15bd0a90-ccdb-413a-a19e-0eb0038ac265 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:05:30.103 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:05:30.103 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:05:30.103 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:05:30.104 143780 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:05:30.104 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:05:30.104 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:05:30.104 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:05:31.104 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:05:31.104 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:05:31.104 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:05:31.105 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:05:31.105 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:05:31.105 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:05:31.105 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:05:31.105 143780 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:05:31.105 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:05:31.105 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:05:31.105 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:05:31.106 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:05:31.121 143781 DEBUG oslo_service.periodic_task [req-2a343859-48b9-4fb5-b6a8-24300cd3ec28 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:05:31.125 143781 DEBUG oslo_concurrency.lockutils [req-96e72dbc-0585-4e57-901d-0f6ea5666223 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:05:31.125 143781 DEBUG oslo_concurrency.lockutils [req-96e72dbc-0585-4e57-901d-0f6ea5666223 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:05:33.106 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 
2026-04-02 01:05:33.107 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:05:33.107 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:05:33.107 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:05:33.107 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:05:33.108 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:05:33.108 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:05:33.108 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:05:33.108 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:05:33.108 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:05:33.108 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:05:33.108 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 
2026-04-02 01:05:37.109 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:05:37.110 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:05:37.110 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:05:37.111 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:05:37.111 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:05:37.111 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:05:37.111 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:05:37.111 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:05:37.111 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:05:37.111 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:05:37.111 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 
2026-04-02 01:05:37.111 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:05:40.053 143780 DEBUG oslo_service.periodic_task [req-b8efeb8e-8241-4aa6-9872-3c6adb3db4a1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:05:40.057 143780 DEBUG oslo_concurrency.lockutils [req-5ec516cf-3c4f-4629-b32b-52d38ba6c8ac - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:05:40.057 143780 DEBUG oslo_concurrency.lockutils [req-5ec516cf-3c4f-4629-b32b-52d38ba6c8ac - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:05:42.070 143779 DEBUG oslo_service.periodic_task [req-5c59ccd2-c82d-413d-ab1e-f6bea59ecdd9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:05:42.074 143779 DEBUG oslo_concurrency.lockutils [req-6d8cd2ae-7990-48a3-b609-79a7520dad86 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:05:42.074 143779 DEBUG oslo_concurrency.lockutils [req-6d8cd2ae-7990-48a3-b609-79a7520dad86 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:05:45.111 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:05:45.112 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:05:45.112 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:05:45.113 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:05:45.113 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:05:45.113 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:05:45.113 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:05:45.113 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:05:45.114 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:05:45.114 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:05:45.114 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:05:45.114 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:05:49.108 143787 DEBUG oslo_service.periodic_task [req-5ac8f40f-4aae-4dd7-8bc5-b9b6989311e3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:05:49.113 143787 DEBUG oslo_concurrency.lockutils [req-e55ba092-02d2-431c-8ffd-46760cb51899 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:05:49.114 143787 DEBUG oslo_concurrency.lockutils [req-e55ba092-02d2-431c-8ffd-46760cb51899 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:06:01.113 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:06:01.114 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:06:01.114 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:06:01.114 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:06:01.114 143779 DEBUG oslo_messaging._drivers.amqpdriver 
[-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:06:01.114 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:06:01.114 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:06:01.115 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:06:01.115 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:06:01.115 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:06:01.116 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:06:01.116 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:06:01.129 143781 DEBUG oslo_service.periodic_task [req-96e72dbc-0585-4e57-901d-0f6ea5666223 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:06:01.133 143781 DEBUG oslo_concurrency.lockutils [req-079de174-d9eb-424c-bf6c-16ca266c725b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 
01:06:01.133 143781 DEBUG oslo_concurrency.lockutils [req-079de174-d9eb-424c-bf6c-16ca266c725b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:06:11.043 143780 DEBUG oslo_service.periodic_task [req-5ec516cf-3c4f-4629-b32b-52d38ba6c8ac - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:06:11.047 143780 DEBUG oslo_concurrency.lockutils [req-df2d1dde-56b9-4601-ab47-a7cc5e32a213 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:06:11.047 143780 DEBUG oslo_concurrency.lockutils [req-df2d1dde-56b9-4601-ab47-a7cc5e32a213 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:06:13.057 143779 DEBUG oslo_service.periodic_task [req-6d8cd2ae-7990-48a3-b609-79a7520dad86 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:06:13.061 143779 DEBUG oslo_concurrency.lockutils [req-5eb259c7-0d85-454a-a743-1ff7bfcd1e59 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:06:13.061 143779 DEBUG oslo_concurrency.lockutils [req-5eb259c7-0d85-454a-a743-1ff7bfcd1e59 - - - - -] Lock 
"93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:06:19.121 143787 DEBUG oslo_service.periodic_task [req-e55ba092-02d2-431c-8ffd-46760cb51899 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:06:19.125 143787 DEBUG oslo_concurrency.lockutils [req-9a45c587-7dfd-4304-b0cc-c63aded7aa73 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:06:19.126 143787 DEBUG oslo_concurrency.lockutils [req-9a45c587-7dfd-4304-b0cc-c63aded7aa73 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:06:31.143 143781 DEBUG oslo_service.periodic_task [req-079de174-d9eb-424c-bf6c-16ca266c725b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:06:31.146 143781 DEBUG oslo_concurrency.lockutils [req-aef35d40-bc27-479e-aad8-425a8cc1bc7d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:06:31.146 143781 DEBUG oslo_concurrency.lockutils [req-aef35d40-bc27-479e-aad8-425a8cc1bc7d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:06:33.119 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:06:33.119 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:06:33.119 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:06:33.123 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:06:33.123 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:06:33.123 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:06:33.124 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:06:33.124 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:06:33.124 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:06:33.124 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 
01:06:33.124 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:06:33.125 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:06:41.051 143780 DEBUG oslo_service.periodic_task [req-df2d1dde-56b9-4601-ab47-a7cc5e32a213 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:06:41.055 143780 DEBUG oslo_concurrency.lockutils [req-f632d27e-e18c-454c-b8ea-bb0d813001c4 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:06:41.056 143780 DEBUG oslo_concurrency.lockutils [req-f632d27e-e18c-454c-b8ea-bb0d813001c4 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:06:44.057 143779 DEBUG oslo_service.periodic_task [req-5eb259c7-0d85-454a-a743-1ff7bfcd1e59 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:06:44.061 143779 DEBUG oslo_concurrency.lockutils [req-3497536c-140a-452c-b255-22b4ccb51181 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:06:44.061 143779 DEBUG oslo_concurrency.lockutils [req-3497536c-140a-452c-b255-22b4ccb51181 - - - 
- -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:06:50.083 143787 DEBUG oslo_service.periodic_task [req-9a45c587-7dfd-4304-b0cc-c63aded7aa73 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:06:50.088 143787 DEBUG oslo_concurrency.lockutils [req-886f166d-6a2d-414e-b5d5-d2a27d5d7499 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:06:50.088 143787 DEBUG oslo_concurrency.lockutils [req-886f166d-6a2d-414e-b5d5-d2a27d5d7499 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:07:01.157 143781 DEBUG oslo_service.periodic_task [req-aef35d40-bc27-479e-aad8-425a8cc1bc7d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:07:01.161 143781 DEBUG oslo_concurrency.lockutils [req-ec517080-5dc6-4ac5-bdd2-4e7809f50aed - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:07:01.162 143781 DEBUG oslo_concurrency.lockutils [req-ec517080-5dc6-4ac5-bdd2-4e7809f50aed - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:07:12.044 143780 DEBUG oslo_service.periodic_task [req-f632d27e-e18c-454c-b8ea-bb0d813001c4 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:07:12.049 143780 DEBUG oslo_concurrency.lockutils [req-3dae348a-9f1b-488d-af3a-b3c67b4b301f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:07:12.050 143780 DEBUG oslo_concurrency.lockutils [req-3dae348a-9f1b-488d-af3a-b3c67b4b301f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:07:14.071 143779 DEBUG oslo_service.periodic_task [req-3497536c-140a-452c-b255-22b4ccb51181 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:07:14.075 143779 DEBUG oslo_concurrency.lockutils [req-5b4fc8cf-b79f-451e-a7e7-6a2001ac105e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:07:14.076 143779 DEBUG oslo_concurrency.lockutils [req-5b4fc8cf-b79f-451e-a7e7-6a2001ac105e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:07:20.098 143787 DEBUG oslo_service.periodic_task [req-886f166d-6a2d-414e-b5d5-d2a27d5d7499 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:07:20.102 143787 DEBUG oslo_concurrency.lockutils [req-86542d72-bbd0-4462-beca-59c9db09fdd6 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:07:20.102 143787 DEBUG oslo_concurrency.lockutils [req-86542d72-bbd0-4462-beca-59c9db09fdd6 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:07:29.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7b07350d770a465abdd9f7e1057464c8 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:07:29.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7b07350d770a465abdd9f7e1057464c8 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:07:29.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7b07350d770a465abdd9f7e1057464c8 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:07:29.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7b07350d770a465abdd9f7e1057464c8 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:07:29.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:29.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:29.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:29.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:29.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 7b07350d770a465abdd9f7e1057464c8 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:07:29.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 7b07350d770a465abdd9f7e1057464c8 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:07:29.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 7b07350d770a465abdd9f7e1057464c8 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:07:29.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 7b07350d770a465abdd9f7e1057464c8 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:07:29.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:29.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:29.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:29.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:29.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:07:29.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:07:29.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:07:29.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:07:29.813 143787 DEBUG oslo_concurrency.lockutils [req-ac67dd6e-15ec-4e9e-84f2-010226df3cd8 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:07:29.814 143781 DEBUG oslo_concurrency.lockutils [req-ac67dd6e-15ec-4e9e-84f2-010226df3cd8 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:07:29.814 143780 DEBUG oslo_concurrency.lockutils [req-ac67dd6e-15ec-4e9e-84f2-010226df3cd8 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:07:29.814 143779 DEBUG oslo_concurrency.lockutils [req-ac67dd6e-15ec-4e9e-84f2-010226df3cd8 - - - 
- -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:07:29.814 143787 DEBUG nova.scheduler.host_manager [req-ac67dd6e-15ec-4e9e-84f2-010226df3cd8 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:07:29.814 143781 DEBUG nova.scheduler.host_manager [req-ac67dd6e-15ec-4e9e-84f2-010226df3cd8 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:07:29.814 143780 DEBUG nova.scheduler.host_manager [req-ac67dd6e-15ec-4e9e-84f2-010226df3cd8 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:07:29.814 143779 DEBUG nova.scheduler.host_manager [req-ac67dd6e-15ec-4e9e-84f2-010226df3cd8 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:07:29.814 143787 DEBUG oslo_concurrency.lockutils [req-ac67dd6e-15ec-4e9e-84f2-010226df3cd8 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:07:29.814 143781 DEBUG oslo_concurrency.lockutils [req-ac67dd6e-15ec-4e9e-84f2-010226df3cd8 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:07:29.814 143779 DEBUG oslo_concurrency.lockutils [req-ac67dd6e-15ec-4e9e-84f2-010226df3cd8 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:07:29.814 143780 DEBUG oslo_concurrency.lockutils [req-ac67dd6e-15ec-4e9e-84f2-010226df3cd8 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:07:29.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:07:29.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:07:29.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:07:29.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:07:29.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:29.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:29.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:07:29.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:29.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:29.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:07:29.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:07:29.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:07:30.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:07:30.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:07:30.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:07:30.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:07:30.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:30.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:30.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:07:30.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:07:30.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:30.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:30.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:07:30.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:07:31.166 143781 DEBUG oslo_service.periodic_task [req-ec517080-5dc6-4ac5-bdd2-4e7809f50aed - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:07:31.170 143781 
DEBUG oslo_concurrency.lockutils [req-91b25d2e-83ea-4edf-a5f3-4bf82d162b7d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:07:31.171 143781 DEBUG oslo_concurrency.lockutils [req-91b25d2e-83ea-4edf-a5f3-4bf82d162b7d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:07:32.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:07:32.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:07:32.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:32.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:07:32.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:32.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:07:32.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:07:32.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] 
Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:32.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:07:32.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:07:32.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:32.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:07:36.821 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:07:36.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:36.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:07:36.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:07:36.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:07:36.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:36.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] 
AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:07:36.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:36.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:07:36.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:07:36.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:36.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:07:42.060 143780 DEBUG oslo_service.periodic_task [req-3dae348a-9f1b-488d-af3a-b3c67b4b301f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:07:42.064 143780 DEBUG oslo_concurrency.lockutils [req-2b58f35e-e978-4a0c-a5d4-0fa358c1ba15 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:07:42.064 143780 DEBUG oslo_concurrency.lockutils [req-2b58f35e-e978-4a0c-a5d4-0fa358c1ba15 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:07:44.087 143779 DEBUG 
oslo_service.periodic_task [req-5b4fc8cf-b79f-451e-a7e7-6a2001ac105e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:07:44.091 143779 DEBUG oslo_concurrency.lockutils [req-00726198-fa11-4c3c-977d-44720a1f78e2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:07:44.091 143779 DEBUG oslo_concurrency.lockutils [req-00726198-fa11-4c3c-977d-44720a1f78e2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:07:44.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:07:44.823 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:07:44.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:07:44.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:44.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:44.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 
2026-04-02 01:07:44.824 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:44.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:07:44.824 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:07:44.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:07:44.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:07:44.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:07:51.083 143787 DEBUG oslo_service.periodic_task [req-86542d72-bbd0-4462-beca-59c9db09fdd6 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:07:51.087 143787 DEBUG oslo_concurrency.lockutils [req-2fca939a-3157-4e8e-8de2-ae414734e3b2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:07:51.088 143787 DEBUG oslo_concurrency.lockutils [req-2fca939a-3157-4e8e-8de2-ae414734e3b2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:08:00.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:08:00.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:08:00.825 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:08:00.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:08:00.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:08:00.825 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:08:00.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:08:00.826 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:08:00.826 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:08:00.826 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:08:00.826 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:08:00.826 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:08:01.175 143781 DEBUG oslo_service.periodic_task [req-91b25d2e-83ea-4edf-a5f3-4bf82d162b7d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:08:01.179 143781 DEBUG oslo_concurrency.lockutils [req-6c1656e2-f1ff-42c2-b223-e8e148d89027 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:08:01.180 143781 DEBUG oslo_concurrency.lockutils [req-6c1656e2-f1ff-42c2-b223-e8e148d89027 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:08:12.073 143780 DEBUG oslo_service.periodic_task [req-2b58f35e-e978-4a0c-a5d4-0fa358c1ba15 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:08:12.077 143780 DEBUG oslo_concurrency.lockutils [req-74af388f-0a5d-44a9-8bca-52dcc3d06249 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:08:12.077 143780 DEBUG oslo_concurrency.lockutils [req-74af388f-0a5d-44a9-8bca-52dcc3d06249 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by 
"nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:08:15.057 143779 DEBUG oslo_service.periodic_task [req-00726198-fa11-4c3c-977d-44720a1f78e2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:08:15.062 143779 DEBUG oslo_concurrency.lockutils [req-e0c8d830-9890-4d80-a806-a1a393774e03 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:08:15.062 143779 DEBUG oslo_concurrency.lockutils [req-e0c8d830-9890-4d80-a806-a1a393774e03 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:08:21.093 143787 DEBUG oslo_service.periodic_task [req-2fca939a-3157-4e8e-8de2-ae414734e3b2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:08:21.097 143787 DEBUG oslo_concurrency.lockutils [req-fa0a4cb0-3588-4bfd-8c94-60d1dd5cf6d4 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:08:21.097 143787 DEBUG oslo_concurrency.lockutils [req-fa0a4cb0-3588-4bfd-8c94-60d1dd5cf6d4 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:08:32.042 143781 DEBUG oslo_service.periodic_task [req-6c1656e2-f1ff-42c2-b223-e8e148d89027 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:08:32.046 143781 DEBUG oslo_concurrency.lockutils [req-1d6af024-a263-4c9b-9d47-b9420495f832 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:08:32.046 143781 DEBUG oslo_concurrency.lockutils [req-1d6af024-a263-4c9b-9d47-b9420495f832 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:08:32.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:08:32.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:08:32.828 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:08:32.833 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:08:32.834 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:08:32.834 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:08:32.834 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:08:32.835 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:08:32.835 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:08:32.835 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:08:32.836 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:08:32.836 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:08:42.085 143780 DEBUG oslo_service.periodic_task [req-74af388f-0a5d-44a9-8bca-52dcc3d06249 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:08:42.089 143780 DEBUG oslo_concurrency.lockutils [req-d61f6875-4c14-4264-b344-33d100c57eb9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:08:42.089 143780 DEBUG oslo_concurrency.lockutils [req-d61f6875-4c14-4264-b344-33d100c57eb9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:08:45.068 143779 DEBUG oslo_service.periodic_task [req-e0c8d830-9890-4d80-a806-a1a393774e03 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:08:45.071 143779 DEBUG oslo_concurrency.lockutils [req-d432e900-218a-4903-8ce9-27e4bb89153a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:08:45.071 143779 DEBUG oslo_concurrency.lockutils [req-d432e900-218a-4903-8ce9-27e4bb89153a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:08:51.104 143787 DEBUG oslo_service.periodic_task [req-fa0a4cb0-3588-4bfd-8c94-60d1dd5cf6d4 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:08:51.108 143787 DEBUG oslo_concurrency.lockutils [req-41e9945b-cdeb-409d-9838-fc3b6b680eed - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:08:51.108 143787 DEBUG oslo_concurrency.lockutils [req-41e9945b-cdeb-409d-9838-fc3b6b680eed - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:09:02.056 143781 DEBUG oslo_service.periodic_task [req-1d6af024-a263-4c9b-9d47-b9420495f832 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:09:02.060 143781 DEBUG oslo_concurrency.lockutils [req-f70a45fa-fa18-40b6-b0c7-2ea2817f0d23 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:09:02.060 143781 DEBUG oslo_concurrency.lockutils [req-f70a45fa-fa18-40b6-b0c7-2ea2817f0d23 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:09:12.095 143780 DEBUG oslo_service.periodic_task [req-d61f6875-4c14-4264-b344-33d100c57eb9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:09:12.099 143780 DEBUG oslo_concurrency.lockutils [req-d2b83ed3-63c7-4ef1-b34c-a57e249ac515 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:09:12.099 143780 DEBUG oslo_concurrency.lockutils [req-d2b83ed3-63c7-4ef1-b34c-a57e249ac515 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:09:15.076 143779 DEBUG 
oslo_service.periodic_task [req-d432e900-218a-4903-8ce9-27e4bb89153a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:09:15.080 143779 DEBUG oslo_concurrency.lockutils [req-3c769f0c-f299-4aef-87e8-7ece5fd998b1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:09:15.080 143779 DEBUG oslo_concurrency.lockutils [req-3c769f0c-f299-4aef-87e8-7ece5fd998b1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:09:21.117 143787 DEBUG oslo_service.periodic_task [req-41e9945b-cdeb-409d-9838-fc3b6b680eed - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:09:21.121 143787 DEBUG oslo_concurrency.lockutils [req-13fcfafb-8796-4cce-a53e-e3780ec06353 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:09:21.121 143787 DEBUG oslo_concurrency.lockutils [req-13fcfafb-8796-4cce-a53e-e3780ec06353 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:09:33.042 143781 DEBUG oslo_service.periodic_task [req-f70a45fa-fa18-40b6-b0c7-2ea2817f0d23 - - - - -] Running periodic task 
SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:09:33.046 143781 DEBUG oslo_concurrency.lockutils [req-5a12171f-afd1-4326-85ac-22529e99e17f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:09:33.046 143781 DEBUG oslo_concurrency.lockutils [req-5a12171f-afd1-4326-85ac-22529e99e17f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:09:33.810 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 91047fbf28d64b99b2f06dd245a30c20 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:09:33.810 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 91047fbf28d64b99b2f06dd245a30c20 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:09:33.810 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 91047fbf28d64b99b2f06dd245a30c20 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:09:33.810 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 91047fbf28d64b99b2f06dd245a30c20 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:09:33.810 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:33.810 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:33.810 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:33.810 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 91047fbf28d64b99b2f06dd245a30c20 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:09:33.810 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 91047fbf28d64b99b2f06dd245a30c20 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:09:33.810 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:33.810 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 91047fbf28d64b99b2f06dd245a30c20 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:09:33.810 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 91047fbf28d64b99b2f06dd245a30c20 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:09:33.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:33.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:33.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:09:33.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:33.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:09:33.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:09:33.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:33.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:09:33.811 143787 DEBUG oslo_concurrency.lockutils [req-a51f4f7a-e651-4886-99b1-b57668fdb051 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:09:33.811 143779 DEBUG oslo_concurrency.lockutils [req-a51f4f7a-e651-4886-99b1-b57668fdb051 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:09:33.811 143781 DEBUG oslo_concurrency.lockutils [req-a51f4f7a-e651-4886-99b1-b57668fdb051 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:09:33.811 143787 DEBUG nova.scheduler.host_manager [req-a51f4f7a-e651-4886-99b1-b57668fdb051 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:09:33.811 143779 DEBUG nova.scheduler.host_manager [req-a51f4f7a-e651-4886-99b1-b57668fdb051 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:09:33.812 143781 DEBUG nova.scheduler.host_manager [req-a51f4f7a-e651-4886-99b1-b57668fdb051 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:09:33.812 143787 DEBUG oslo_concurrency.lockutils [req-a51f4f7a-e651-4886-99b1-b57668fdb051 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:09:33.812 143780 DEBUG oslo_concurrency.lockutils [req-a51f4f7a-e651-4886-99b1-b57668fdb051 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:09:33.812 143779 DEBUG oslo_concurrency.lockutils [req-a51f4f7a-e651-4886-99b1-b57668fdb051 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:09:33.812 143781 DEBUG oslo_concurrency.lockutils [req-a51f4f7a-e651-4886-99b1-b57668fdb051 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:09:33.812 143780 DEBUG nova.scheduler.host_manager [req-a51f4f7a-e651-4886-99b1-b57668fdb051 - - - - -] Successfully synced 
instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:09:33.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:09:33.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:33.812 143780 DEBUG oslo_concurrency.lockutils [req-a51f4f7a-e651-4886-99b1-b57668fdb051 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:09:33.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:09:33.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:09:33.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:33.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:09:33.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:09:33.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:33.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:09:33.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:09:33.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:33.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:09:34.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:09:34.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:34.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:09:34.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:09:34.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:34.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:09:34.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:09:34.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:09:34.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:34.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:09:34.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:34.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:09:36.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:09:36.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:36.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:09:36.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:09:36.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:36.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:09:36.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:09:36.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:09:36.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:36.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:36.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:09:36.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:09:40.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:09:40.821 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:40.821 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:09:40.821 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:09:40.821 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:40.821 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:09:40.821 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:09:40.821 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:09:40.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:40.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:09:40.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:40.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:09:43.044 143780 DEBUG oslo_service.periodic_task [req-d2b83ed3-63c7-4ef1-b34c-a57e249ac515 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:09:43.048 143780 DEBUG oslo_concurrency.lockutils [req-676e5ed3-7941-4cc6-964f-40766ffaee98 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:09:43.048 143780 DEBUG oslo_concurrency.lockutils [req-676e5ed3-7941-4cc6-964f-40766ffaee98 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:09:45.085 143779 DEBUG oslo_service.periodic_task [req-3c769f0c-f299-4aef-87e8-7ece5fd998b1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:09:45.089 143779 DEBUG oslo_concurrency.lockutils [req-47db3776-8003-4bb6-8604-52dd2224a961 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:09:45.089 143779 DEBUG oslo_concurrency.lockutils [req-47db3776-8003-4bb6-8604-52dd2224a961 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:09:48.825 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:09:48.826 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:48.826 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:09:48.826 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:09:48.826 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:48.826 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:09:48.826 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:09:48.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:09:48.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:48.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:09:48.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:09:48.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:09:51.128 143787 DEBUG oslo_service.periodic_task [req-13fcfafb-8796-4cce-a53e-e3780ec06353 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:09:51.132 143787 DEBUG oslo_concurrency.lockutils [req-49393810-1b76-42e1-b31a-5c8d22f26d04 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:09:51.132 143787 DEBUG 
oslo_concurrency.lockutils [req-49393810-1b76-42e1-b31a-5c8d22f26d04 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:10:04.041 143781 DEBUG oslo_service.periodic_task [req-5a12171f-afd1-4326-85ac-22529e99e17f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:10:04.045 143781 DEBUG oslo_concurrency.lockutils [req-53e5a1eb-ca77-4fc6-a429-62b2c3ef15e9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:10:04.045 143781 DEBUG oslo_concurrency.lockutils [req-53e5a1eb-ca77-4fc6-a429-62b2c3ef15e9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:10:04.827 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:10:04.827 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:10:04.828 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:10:04.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:10:04.828 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:10:04.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:10:04.828 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:10:04.828 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:10:04.828 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:10:04.829 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:10:04.829 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:10:04.829 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:10:13.056 143780 DEBUG oslo_service.periodic_task [req-676e5ed3-7941-4cc6-964f-40766ffaee98 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:10:13.060 143780 DEBUG oslo_concurrency.lockutils [req-77599462-a04f-4f66-b5c5-3abc2fb3152b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:10:13.060 143780 DEBUG oslo_concurrency.lockutils [req-77599462-a04f-4f66-b5c5-3abc2fb3152b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:10:15.094 143779 DEBUG oslo_service.periodic_task [req-47db3776-8003-4bb6-8604-52dd2224a961 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:10:15.098 143779 DEBUG oslo_concurrency.lockutils [req-09db3205-f5f1-463c-a2a7-ffd94c94b53b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:10:15.099 143779 DEBUG oslo_concurrency.lockutils [req-09db3205-f5f1-463c-a2a7-ffd94c94b53b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:10:22.083 143787 DEBUG oslo_service.periodic_task [req-49393810-1b76-42e1-b31a-5c8d22f26d04 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:10:22.087 143787 DEBUG oslo_concurrency.lockutils [req-162db517-3a6b-40af-92f6-aefa92d99fd0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:10:22.087 143787 DEBUG oslo_concurrency.lockutils [req-162db517-3a6b-40af-92f6-aefa92d99fd0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:10:34.059 143781 DEBUG oslo_service.periodic_task [req-53e5a1eb-ca77-4fc6-a429-62b2c3ef15e9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:10:34.063 143781 DEBUG oslo_concurrency.lockutils [req-46c26b31-189c-491a-84b5-df67b1d8d4c8 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:10:34.064 143781 DEBUG oslo_concurrency.lockutils [req-46c26b31-189c-491a-84b5-df67b1d8d4c8 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:10:36.829 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:10:36.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:10:36.830 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:10:36.830 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:10:36.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:10:36.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:10:36.831 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:10:36.831 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:10:36.831 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:10:36.832 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:10:36.832 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:10:36.833 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:10:43.068 143780 DEBUG oslo_service.periodic_task [req-77599462-a04f-4f66-b5c5-3abc2fb3152b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:10:43.072 143780 DEBUG oslo_concurrency.lockutils [req-ed808f21-693f-4beb-9520-24af573fc822 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:10:43.073 143780 DEBUG oslo_concurrency.lockutils [req-ed808f21-693f-4beb-9520-24af573fc822 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:10:45.103 143779 DEBUG oslo_service.periodic_task [req-09db3205-f5f1-463c-a2a7-ffd94c94b53b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:10:45.107 143779 DEBUG oslo_concurrency.lockutils [req-4d83a93a-46eb-4de5-bb5d-30da2f178acb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:10:45.107 143779 DEBUG oslo_concurrency.lockutils [req-4d83a93a-46eb-4de5-bb5d-30da2f178acb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:10:53.083 143787 DEBUG oslo_service.periodic_task [req-162db517-3a6b-40af-92f6-aefa92d99fd0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:10:53.087 143787 DEBUG oslo_concurrency.lockutils [req-1444dfd1-41b2-443d-b457-726262b47836 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:10:53.088 143787 DEBUG oslo_concurrency.lockutils [req-1444dfd1-41b2-443d-b457-726262b47836 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:11:05.042 143781 DEBUG oslo_service.periodic_task [req-46c26b31-189c-491a-84b5-df67b1d8d4c8 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:11:05.045 143781 DEBUG oslo_concurrency.lockutils [req-29addb93-d2ed-4eda-a08f-539787229132 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:11:05.045 143781 DEBUG oslo_concurrency.lockutils [req-29addb93-d2ed-4eda-a08f-539787229132 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:11:13.081 143780 DEBUG oslo_service.periodic_task [req-ed808f21-693f-4beb-9520-24af573fc822 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:11:13.085 143780 DEBUG oslo_concurrency.lockutils [req-17b3ff89-4ba6-4d65-963d-bb964f918f07 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:11:13.086 143780 DEBUG oslo_concurrency.lockutils [req-17b3ff89-4ba6-4d65-963d-bb964f918f07 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:11:15.112 143779 DEBUG oslo_service.periodic_task [req-4d83a93a-46eb-4de5-bb5d-30da2f178acb - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:11:15.116 143779 DEBUG oslo_concurrency.lockutils [req-c1b38320-e52f-43f2-bd38-c9e247cf1cb8 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:11:15.116 143779 DEBUG oslo_concurrency.lockutils [req-c1b38320-e52f-43f2-bd38-c9e247cf1cb8 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:11:23.098 143787 DEBUG oslo_service.periodic_task [req-1444dfd1-41b2-443d-b457-726262b47836 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:11:23.103 143787 DEBUG oslo_concurrency.lockutils [req-5a60ab8f-f9ef-440c-bd4d-86afa642aafd - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:11:23.104 143787 DEBUG oslo_concurrency.lockutils [req-5a60ab8f-f9ef-440c-bd4d-86afa642aafd - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:11:34.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7df75373da354bd6aefe7edf42dc45ae __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:11:34.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7df75373da354bd6aefe7edf42dc45ae __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:11:34.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7df75373da354bd6aefe7edf42dc45ae __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:11:34.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7df75373da354bd6aefe7edf42dc45ae __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:11:34.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:34.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:34.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:34.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:34.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 7df75373da354bd6aefe7edf42dc45ae poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:11:34.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 7df75373da354bd6aefe7edf42dc45ae poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:11:34.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 7df75373da354bd6aefe7edf42dc45ae poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:11:34.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 7df75373da354bd6aefe7edf42dc45ae poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:11:34.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:34.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:34.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:34.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:34.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:11:34.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:11:34.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:11:34.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:11:34.814 143780 DEBUG oslo_concurrency.lockutils [req-1459428b-ba70-4ad1-bebb-d7f615b8609c - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:11:34.814 143781 DEBUG oslo_concurrency.lockutils [req-1459428b-ba70-4ad1-bebb-d7f615b8609c - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:11:34.814 143787 DEBUG oslo_concurrency.lockutils [req-1459428b-ba70-4ad1-bebb-d7f615b8609c - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:11:34.814 143779 DEBUG oslo_concurrency.lockutils [req-1459428b-ba70-4ad1-bebb-d7f615b8609c - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:11:34.814 143780 DEBUG nova.scheduler.host_manager [req-1459428b-ba70-4ad1-bebb-d7f615b8609c - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:11:34.814 143781 DEBUG nova.scheduler.host_manager [req-1459428b-ba70-4ad1-bebb-d7f615b8609c - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:11:34.814 143779 DEBUG nova.scheduler.host_manager [req-1459428b-ba70-4ad1-bebb-d7f615b8609c - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:11:34.814 143787 DEBUG nova.scheduler.host_manager [req-1459428b-ba70-4ad1-bebb-d7f615b8609c - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:11:34.814 143780 DEBUG oslo_concurrency.lockutils [req-1459428b-ba70-4ad1-bebb-d7f615b8609c - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:11:34.814 143781 DEBUG oslo_concurrency.lockutils [req-1459428b-ba70-4ad1-bebb-d7f615b8609c - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:11:34.815 143787 DEBUG oslo_concurrency.lockutils [req-1459428b-ba70-4ad1-bebb-d7f615b8609c - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:11:34.815 143779 DEBUG oslo_concurrency.lockutils [req-1459428b-ba70-4ad1-bebb-d7f615b8609c - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:11:34.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:11:34.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:34.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:11:34.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:11:34.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:11:34.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:11:34.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:34.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:34.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:34.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:11:34.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:11:34.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:11:35.050 143781 DEBUG oslo_service.periodic_task [req-29addb93-d2ed-4eda-a08f-539787229132 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:11:35.054 143781 DEBUG oslo_concurrency.lockutils [req-215208b8-6228-4ffe-95c4-177600d199ba - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:11:35.054 143781 DEBUG oslo_concurrency.lockutils [req-215208b8-6228-4ffe-95c4-177600d199ba - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:11:35.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:11:35.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:35.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:11:35.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:11:35.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:35.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:11:35.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:11:35.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:35.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:11:35.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:11:35.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:35.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:11:37.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:11:37.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:37.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:11:37.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:11:37.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:37.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:11:37.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:11:37.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:11:37.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:37.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:11:37.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:37.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:11:41.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:11:41.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:41.823 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:11:41.823 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:11:41.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:11:41.823 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:41.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:41.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:11:41.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:11:41.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:11:41.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:41.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:11:44.044 143780 DEBUG oslo_service.periodic_task [req-17b3ff89-4ba6-4d65-963d-bb964f918f07 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:11:44.048 143780 DEBUG oslo_concurrency.lockutils [req-bc7dd4b4-8438-4e3d-8f31-9b5b6bf393d3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:11:44.049 143780 DEBUG oslo_concurrency.lockutils [req-bc7dd4b4-8438-4e3d-8f31-9b5b6bf393d3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:11:46.057 143779 DEBUG oslo_service.periodic_task [req-c1b38320-e52f-43f2-bd38-c9e247cf1cb8 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:11:46.061 143779 DEBUG oslo_concurrency.lockutils [req-759c5970-9fed-41ec-92e9-b9e5d11e2bab - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:11:46.061 143779 DEBUG oslo_concurrency.lockutils [req-759c5970-9fed-41ec-92e9-b9e5d11e2bab - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:11:49.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:11:49.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:49.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:11:49.829 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:11:49.829 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:11:49.830 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:49.829 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:11:49.830 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:11:49.830 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:49.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:11:49.830 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:11:49.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:11:54.084 143787 DEBUG oslo_service.periodic_task [req-5a60ab8f-f9ef-440c-bd4d-86afa642aafd - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:11:54.088 143787 DEBUG oslo_concurrency.lockutils [req-c697f8e7-a7d6-4a04-b11f-307fded3a631 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:11:54.088 143787 DEBUG oslo_concurrency.lockutils [req-c697f8e7-a7d6-4a04-b11f-307fded3a631 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:12:05.058 143781 DEBUG oslo_service.periodic_task [req-215208b8-6228-4ffe-95c4-177600d199ba - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:12:05.062 143781 DEBUG oslo_concurrency.lockutils [req-bd45ebfe-db79-46a8-8819-968d7645dab7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:12:05.062 143781 DEBUG oslo_concurrency.lockutils [req-bd45ebfe-db79-46a8-8819-968d7645dab7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:12:05.831 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:12:05.831 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:12:05.831 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:12:05.831 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:12:05.832 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:12:05.832 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:12:05.831 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:12:05.832 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:12:05.832 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:12:05.832 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:12:05.832 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:12:05.832 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:12:14.058 143780 DEBUG oslo_service.periodic_task [req-bc7dd4b4-8438-4e3d-8f31-9b5b6bf393d3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:12:14.062 143780 DEBUG oslo_concurrency.lockutils [req-5ceb5efd-3a94-4cc8-a8d7-522e6f153ae3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:12:14.062 143780 DEBUG oslo_concurrency.lockutils [req-5ceb5efd-3a94-4cc8-a8d7-522e6f153ae3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:12:16.067 143779 DEBUG oslo_service.periodic_task [req-759c5970-9fed-41ec-92e9-b9e5d11e2bab - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:12:16.071 143779 DEBUG oslo_concurrency.lockutils [req-51bd01af-0e34-4648-a75f-afda1a02cd1b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:12:16.072 143779 DEBUG oslo_concurrency.lockutils [req-51bd01af-0e34-4648-a75f-afda1a02cd1b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:12:25.083 143787 DEBUG oslo_service.periodic_task [req-c697f8e7-a7d6-4a04-b11f-307fded3a631 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:12:25.087 143787 DEBUG
oslo_concurrency.lockutils [req-dd4590b8-5982-446a-b5db-943714536b25 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:12:25.087 143787 DEBUG oslo_concurrency.lockutils [req-dd4590b8-5982-446a-b5db-943714536b25 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:12:35.067 143781 DEBUG oslo_service.periodic_task [req-bd45ebfe-db79-46a8-8819-968d7645dab7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:12:35.071 143781 DEBUG oslo_concurrency.lockutils [req-dfffbf94-103d-4fce-b412-52bb8a0a484f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:12:35.071 143781 DEBUG oslo_concurrency.lockutils [req-dfffbf94-103d-4fce-b412-52bb8a0a484f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:12:37.833 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:12:37.834 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:12:37.834 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:12:37.835 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:12:37.835 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:12:37.835 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:12:37.835 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:12:37.835 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:12:37.835 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:12:37.836 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:12:37.837 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:12:37.837 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:12:45.044 143780 DEBUG oslo_service.periodic_task [req-5ceb5efd-3a94-4cc8-a8d7-522e6f153ae3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks 
/usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:12:45.048 143780 DEBUG oslo_concurrency.lockutils [req-85f539ca-89a1-4b02-8527-1fcd6fe0fd07 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:12:45.048 143780 DEBUG oslo_concurrency.lockutils [req-85f539ca-89a1-4b02-8527-1fcd6fe0fd07 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:12:46.078 143779 DEBUG oslo_service.periodic_task [req-51bd01af-0e34-4648-a75f-afda1a02cd1b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:12:46.081 143779 DEBUG oslo_concurrency.lockutils [req-6b52447f-4cf6-40df-96bd-b41e7ac11afe - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:12:46.081 143779 DEBUG oslo_concurrency.lockutils [req-6b52447f-4cf6-40df-96bd-b41e7ac11afe - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:12:56.084 143787 DEBUG oslo_service.periodic_task [req-dd4590b8-5982-446a-b5db-943714536b25 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:12:56.088 143787 DEBUG 
oslo_concurrency.lockutils [req-0ca3fa97-51f8-4e46-9024-e9ff9d81a2cb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:12:56.088 143787 DEBUG oslo_concurrency.lockutils [req-0ca3fa97-51f8-4e46-9024-e9ff9d81a2cb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:13:06.042 143781 DEBUG oslo_service.periodic_task [req-dfffbf94-103d-4fce-b412-52bb8a0a484f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:13:06.046 143781 DEBUG oslo_concurrency.lockutils [req-e24bc723-0aa9-4f69-9dbd-1fd92065202b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:13:06.047 143781 DEBUG oslo_concurrency.lockutils [req-e24bc723-0aa9-4f69-9dbd-1fd92065202b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:13:16.044 143780 DEBUG oslo_service.periodic_task [req-85f539ca-89a1-4b02-8527-1fcd6fe0fd07 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:13:16.048 143780 DEBUG oslo_concurrency.lockutils [req-b6096355-0f19-4360-8fb3-7ba3964aed3d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" 
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:13:16.049 143780 DEBUG oslo_concurrency.lockutils [req-b6096355-0f19-4360-8fb3-7ba3964aed3d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:13:16.087 143779 DEBUG oslo_service.periodic_task [req-6b52447f-4cf6-40df-96bd-b41e7ac11afe - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:13:16.091 143779 DEBUG oslo_concurrency.lockutils [req-4680678b-39a8-42ac-9e82-c36411e80715 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:13:16.091 143779 DEBUG oslo_concurrency.lockutils [req-4680678b-39a8-42ac-9e82-c36411e80715 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:13:26.093 143787 DEBUG oslo_service.periodic_task [req-0ca3fa97-51f8-4e46-9024-e9ff9d81a2cb - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:13:26.099 143787 DEBUG oslo_concurrency.lockutils [req-2e813997-7e22-4d35-a2c7-93ce453074ca - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:13:26.099 143787 DEBUG oslo_concurrency.lockutils [req-2e813997-7e22-4d35-a2c7-93ce453074ca - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:13:36.052 143781 DEBUG oslo_service.periodic_task [req-e24bc723-0aa9-4f69-9dbd-1fd92065202b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:13:36.057 143781 DEBUG oslo_concurrency.lockutils [req-366b5110-448d-4472-83d1-0c8928c15e93 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:13:36.057 143781 DEBUG oslo_concurrency.lockutils [req-366b5110-448d-4472-83d1-0c8928c15e93 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:13:36.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1debc17cf7264cbcab30870af5d9c778 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:13:36.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1debc17cf7264cbcab30870af5d9c778 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:13:36.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1debc17cf7264cbcab30870af5d9c778 __call__ 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:13:36.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1debc17cf7264cbcab30870af5d9c778 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:13:36.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:36.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:36.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1debc17cf7264cbcab30870af5d9c778 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:13:36.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:36.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:36.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1debc17cf7264cbcab30870af5d9c778 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:13:36.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1debc17cf7264cbcab30870af5d9c778 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:13:36.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1debc17cf7264cbcab30870af5d9c778 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:13:36.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is 
running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:36.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:36.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:13:36.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:36.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:13:36.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:36.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:13:36.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:13:36.812 143779 DEBUG oslo_concurrency.lockutils [req-00ccc323-b9bc-4602-91e9-8f8188ace5cc - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:13:36.812 143780 DEBUG oslo_concurrency.lockutils [req-00ccc323-b9bc-4602-91e9-8f8188ace5cc - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:13:36.812 143787 DEBUG 
oslo_concurrency.lockutils [req-00ccc323-b9bc-4602-91e9-8f8188ace5cc - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:13:36.813 143779 DEBUG nova.scheduler.host_manager [req-00ccc323-b9bc-4602-91e9-8f8188ace5cc - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:13:36.813 143780 DEBUG nova.scheduler.host_manager [req-00ccc323-b9bc-4602-91e9-8f8188ace5cc - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:13:36.813 143781 DEBUG oslo_concurrency.lockutils [req-00ccc323-b9bc-4602-91e9-8f8188ace5cc - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:13:36.813 143787 DEBUG nova.scheduler.host_manager [req-00ccc323-b9bc-4602-91e9-8f8188ace5cc - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:13:36.813 143779 DEBUG oslo_concurrency.lockutils [req-00ccc323-b9bc-4602-91e9-8f8188ace5cc - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:13:36.813 143780 DEBUG oslo_concurrency.lockutils [req-00ccc323-b9bc-4602-91e9-8f8188ace5cc - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:13:36.813 143781 DEBUG nova.scheduler.host_manager [req-00ccc323-b9bc-4602-91e9-8f8188ace5cc - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:13:36.813 143787 DEBUG oslo_concurrency.lockutils [req-00ccc323-b9bc-4602-91e9-8f8188ace5cc - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:13:36.813 143781 DEBUG oslo_concurrency.lockutils [req-00ccc323-b9bc-4602-91e9-8f8188ace5cc - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:13:36.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:13:36.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:36.813 143787 DEBUG oslo_messaging._drivers.amqpdriver 
[-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:13:36.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:13:36.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:36.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:13:36.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:13:36.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:36.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:13:36.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:13:36.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:36.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:13:37.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:13:37.815 143780 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:37.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:13:37.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:13:37.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:13:37.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:37.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:13:37.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:37.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:13:37.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:13:37.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:37.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:13:39.817 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:13:39.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:13:39.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:39.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:39.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:13:39.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:13:39.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:13:39.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:39.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:13:39.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:13:39.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:39.819 143779 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:13:43.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:13:43.821 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:43.821 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:13:43.821 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:13:43.821 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:13:43.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:43.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:43.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:13:43.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:13:43.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:13:43.822 
143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:43.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:13:46.095 143779 DEBUG oslo_service.periodic_task [req-4680678b-39a8-42ac-9e82-c36411e80715 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:13:46.099 143779 DEBUG oslo_concurrency.lockutils [req-ab798a07-3656-406f-9693-2dc7b657079d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:13:46.099 143779 DEBUG oslo_concurrency.lockutils [req-ab798a07-3656-406f-9693-2dc7b657079d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:13:47.045 143780 DEBUG oslo_service.periodic_task [req-b6096355-0f19-4360-8fb3-7ba3964aed3d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:13:47.049 143780 DEBUG oslo_concurrency.lockutils [req-dc8a7d54-2ad5-46d2-8b90-976e0e11f274 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:13:47.049 143780 DEBUG oslo_concurrency.lockutils [req-dc8a7d54-2ad5-46d2-8b90-976e0e11f274 - - - - -] Lock 
"93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:13:51.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:13:51.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:51.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:13:51.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:13:51.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:51.829 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:13:51.829 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:13:51.830 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:51.830 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:13:51.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:13:51.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:13:51.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:13:56.105 143787 DEBUG oslo_service.periodic_task [req-2e813997-7e22-4d35-a2c7-93ce453074ca - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:13:56.108 143787 DEBUG oslo_concurrency.lockutils [req-5b0458d0-caea-4e0e-bf47-031b8bbe6ba1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:13:56.109 143787 DEBUG oslo_concurrency.lockutils [req-5b0458d0-caea-4e0e-bf47-031b8bbe6ba1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:14:06.063 143781 DEBUG oslo_service.periodic_task [req-366b5110-448d-4472-83d1-0c8928c15e93 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:14:06.068 143781 DEBUG oslo_concurrency.lockutils [req-5b2b745c-576c-477d-8381-c88729a52d2c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:14:06.068 
143781 DEBUG oslo_concurrency.lockutils [req-5b2b745c-576c-477d-8381-c88729a52d2c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:14:07.826 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:14:07.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:14:07.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:14:07.831 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:14:07.831 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:14:07.832 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:14:07.833 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:14:07.833 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:14:07.833 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:14:07.835 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:14:07.836 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:14:07.836 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:14:16.105 143779 DEBUG oslo_service.periodic_task [req-ab798a07-3656-406f-9693-2dc7b657079d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:14:16.109 143779 DEBUG oslo_concurrency.lockutils [req-e83d27c3-ea9a-486b-9222-3730ee7487ac - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:14:16.109 143779 DEBUG oslo_concurrency.lockutils [req-e83d27c3-ea9a-486b-9222-3730ee7487ac - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:14:17.056 143780 DEBUG oslo_service.periodic_task [req-dc8a7d54-2ad5-46d2-8b90-976e0e11f274 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:14:17.061 143780 DEBUG oslo_concurrency.lockutils [req-e376a240-a489-4049-a6a5-8fd27c5f70b4 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:14:17.061 143780 DEBUG oslo_concurrency.lockutils [req-e376a240-a489-4049-a6a5-8fd27c5f70b4 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:14:26.114 143787 DEBUG oslo_service.periodic_task [req-5b0458d0-caea-4e0e-bf47-031b8bbe6ba1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:14:26.118 143787 DEBUG oslo_concurrency.lockutils [req-b201f1f4-0b7e-4225-afc5-8919cba938c3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:14:26.118 143787 DEBUG oslo_concurrency.lockutils [req-b201f1f4-0b7e-4225-afc5-8919cba938c3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:14:36.074 143781 DEBUG oslo_service.periodic_task [req-5b2b745c-576c-477d-8381-c88729a52d2c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:14:36.078 143781 DEBUG oslo_concurrency.lockutils [req-3a3d21b6-f904-4f2f-b1dc-1d086efe953c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:14:36.078 143781 DEBUG 
oslo_concurrency.lockutils [req-3a3d21b6-f904-4f2f-b1dc-1d086efe953c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:14:47.058 143779 DEBUG oslo_service.periodic_task [req-e83d27c3-ea9a-486b-9222-3730ee7487ac - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:14:47.062 143779 DEBUG oslo_concurrency.lockutils [req-a4b85371-5e37-4d95-9674-ced0b8bd128d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:14:47.062 143779 DEBUG oslo_concurrency.lockutils [req-a4b85371-5e37-4d95-9674-ced0b8bd128d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:14:47.068 143780 DEBUG oslo_service.periodic_task [req-e376a240-a489-4049-a6a5-8fd27c5f70b4 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:14:47.072 143780 DEBUG oslo_concurrency.lockutils [req-8f4109e0-391b-438a-8e7c-5c56188d74f5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:14:47.072 143780 DEBUG oslo_concurrency.lockutils [req-8f4109e0-391b-438a-8e7c-5c56188d74f5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:14:53.095 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:14:53.095 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:14:53.096 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:14:53.097 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:14:53.097 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:14:53.097 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:14:53.109 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:14:53.110 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:14:53.110 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:14:53.140 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 
2026-04-02 01:14:53.141 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:14:53.141 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:14:57.083 143787 DEBUG oslo_service.periodic_task [req-b201f1f4-0b7e-4225-afc5-8919cba938c3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:14:57.087 143787 DEBUG oslo_concurrency.lockutils [req-b09191bb-917a-4671-b4f8-ac76be459c5b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:14:57.087 143787 DEBUG oslo_concurrency.lockutils [req-b09191bb-917a-4671-b4f8-ac76be459c5b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:15:06.084 143781 DEBUG oslo_service.periodic_task [req-3a3d21b6-f904-4f2f-b1dc-1d086efe953c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:15:06.088 143781 DEBUG oslo_concurrency.lockutils [req-2b7cca2b-456a-46d2-aa4d-6f7d25fb5110 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:15:06.088 143781 DEBUG oslo_concurrency.lockutils 
[req-2b7cca2b-456a-46d2-aa4d-6f7d25fb5110 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:15:17.068 143779 DEBUG oslo_service.periodic_task [req-a4b85371-5e37-4d95-9674-ced0b8bd128d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:15:17.072 143779 DEBUG oslo_concurrency.lockutils [req-abbc71fd-91a1-4413-8dea-c2ccc74430d8 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:15:17.072 143779 DEBUG oslo_concurrency.lockutils [req-abbc71fd-91a1-4413-8dea-c2ccc74430d8 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:15:17.079 143780 DEBUG oslo_service.periodic_task [req-8f4109e0-391b-438a-8e7c-5c56188d74f5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:15:17.082 143780 DEBUG oslo_concurrency.lockutils [req-f8ac7b44-9954-43ab-ac10-3d52574ba5cc - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:15:17.082 143780 DEBUG oslo_concurrency.lockutils [req-f8ac7b44-9954-43ab-ac10-3d52574ba5cc - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:15:27.092 143787 DEBUG oslo_service.periodic_task [req-b09191bb-917a-4671-b4f8-ac76be459c5b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:15:27.096 143787 DEBUG oslo_concurrency.lockutils [req-b7f6b8f8-920a-4525-af36-7f730b4b9c8f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:15:27.096 143787 DEBUG oslo_concurrency.lockutils [req-b7f6b8f8-920a-4525-af36-7f730b4b9c8f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:15:36.094 143781 DEBUG oslo_service.periodic_task [req-2b7cca2b-456a-46d2-aa4d-6f7d25fb5110 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:15:36.098 143781 DEBUG oslo_concurrency.lockutils [req-15daadae-3631-4bca-87c3-3e2242374ddd - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:15:36.099 143781 DEBUG oslo_concurrency.lockutils [req-15daadae-3631-4bca-87c3-3e2242374ddd - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:15:38.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d222d83d820d41bab09b95c67952d2ab __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:15:38.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d222d83d820d41bab09b95c67952d2ab __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:15:38.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d222d83d820d41bab09b95c67952d2ab __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:15:38.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d222d83d820d41bab09b95c67952d2ab __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:15:38.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:15:38.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:15:38.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:15:38.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:15:38.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d222d83d820d41bab09b95c67952d2ab poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:15:38.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with 
unique_id: d222d83d820d41bab09b95c67952d2ab poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:15:38.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d222d83d820d41bab09b95c67952d2ab poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:15:38.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d222d83d820d41bab09b95c67952d2ab poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:15:38.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:15:38.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:15:38.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:15:38.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:15:38.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:15:38.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:15:38.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:15:38.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:15:38.812 143781 DEBUG oslo_concurrency.lockutils [req-e4cc83fe-31ae-4ebf-bf9d-ac437edff466 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:15:38.812 143787 DEBUG oslo_concurrency.lockutils [req-e4cc83fe-31ae-4ebf-bf9d-ac437edff466 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:15:38.812 143780 DEBUG oslo_concurrency.lockutils [req-e4cc83fe-31ae-4ebf-bf9d-ac437edff466 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:15:38.812 143779 DEBUG oslo_concurrency.lockutils [req-e4cc83fe-31ae-4ebf-bf9d-ac437edff466 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:15:38.813 143781 DEBUG nova.scheduler.host_manager [req-e4cc83fe-31ae-4ebf-bf9d-ac437edff466 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:15:38.813 143787 DEBUG nova.scheduler.host_manager [req-e4cc83fe-31ae-4ebf-bf9d-ac437edff466 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:15:38.813 143779 DEBUG nova.scheduler.host_manager [req-e4cc83fe-31ae-4ebf-bf9d-ac437edff466 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:15:38.813 143780 DEBUG nova.scheduler.host_manager [req-e4cc83fe-31ae-4ebf-bf9d-ac437edff466 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:15:38.813 143781 DEBUG oslo_concurrency.lockutils [req-e4cc83fe-31ae-4ebf-bf9d-ac437edff466 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:15:38.813 143787 DEBUG oslo_concurrency.lockutils [req-e4cc83fe-31ae-4ebf-bf9d-ac437edff466 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:15:38.813 143780 DEBUG oslo_concurrency.lockutils [req-e4cc83fe-31ae-4ebf-bf9d-ac437edff466 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:15:38.813 143779 DEBUG oslo_concurrency.lockutils [req-e4cc83fe-31ae-4ebf-bf9d-ac437edff466 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:15:38.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:15:38.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:15:38.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:15:38.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:15:38.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:15:38.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:15:38.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:15:38.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:15:38.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:15:38.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:15:38.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:15:38.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:15:39.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:15:39.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:15:39.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:15:39.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:15:39.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:15:39.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:15:39.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:15:39.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:15:39.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:15:39.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:15:39.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:15:39.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:15:41.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:15:41.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:15:41.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:15:41.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:15:41.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:15:41.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:15:41.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:15:41.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:15:41.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:15:41.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:15:41.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:15:41.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:15:45.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:15:45.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:15:45.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:15:45.819 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:15:45.819 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:15:45.819 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:15:45.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:15:45.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:15:45.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:15:45.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:15:45.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:15:45.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:15:47.077 143779 DEBUG oslo_service.periodic_task [req-abbc71fd-91a1-4413-8dea-c2ccc74430d8 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:15:47.081 143779 DEBUG oslo_concurrency.lockutils [req-a66a9c79-2cc2-46b1-8a18-ed5584a8a830 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:15:47.081 143779 DEBUG oslo_concurrency.lockutils [req-a66a9c79-2cc2-46b1-8a18-ed5584a8a830 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:15:48.044 143780 DEBUG oslo_service.periodic_task [req-f8ac7b44-9954-43ab-ac10-3d52574ba5cc - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:15:48.048 143780 DEBUG oslo_concurrency.lockutils [req-de80a08c-beea-4949-883f-55c3fa77f417 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:15:48.048 143780 DEBUG oslo_concurrency.lockutils [req-de80a08c-beea-4949-883f-55c3fa77f417 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:15:53.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:15:53.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:15:53.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:15:53.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:15:53.821 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:15:53.821 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:15:53.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:15:53.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:15:53.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:15:53.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:15:53.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:15:53.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:15:57.104 143787 DEBUG oslo_service.periodic_task [req-b7f6b8f8-920a-4525-af36-7f730b4b9c8f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:15:57.108 143787 DEBUG oslo_concurrency.lockutils [req-99742d31-732f-4a3a-8d76-19a8e71fa0d2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:15:57.108 143787 DEBUG oslo_concurrency.lockutils [req-99742d31-732f-4a3a-8d76-19a8e71fa0d2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:16:06.105 143781 DEBUG oslo_service.periodic_task [req-15daadae-3631-4bca-87c3-3e2242374ddd - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:16:06.109 143781 DEBUG oslo_concurrency.lockutils [req-07c04a9e-1112-4447-9345-f92568412f25 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:16:06.109 143781 DEBUG oslo_concurrency.lockutils [req-07c04a9e-1112-4447-9345-f92568412f25 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:16:09.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:16:09.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:16:09.823 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:16:09.823 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:16:09.823 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:16:09.823 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:16:09.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:16:09.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:16:09.824 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:16:09.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:16:09.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:16:09.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:16:17.088 143779 DEBUG oslo_service.periodic_task [req-a66a9c79-2cc2-46b1-8a18-ed5584a8a830 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:16:17.092 143779 DEBUG oslo_concurrency.lockutils [req-7975d3d7-80bd-4d26-9fa0-19aaa7debedd - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:16:17.092 143779 DEBUG oslo_concurrency.lockutils [req-7975d3d7-80bd-4d26-9fa0-19aaa7debedd - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:16:19.045 143780 DEBUG oslo_service.periodic_task [req-de80a08c-beea-4949-883f-55c3fa77f417 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:16:19.049 143780 DEBUG oslo_concurrency.lockutils [req-a8ff9a82-0838-4dbd-af23-cf104cb45607 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:16:19.049 143780 DEBUG oslo_concurrency.lockutils [req-a8ff9a82-0838-4dbd-af23-cf104cb45607 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:16:27.114 143787 DEBUG oslo_service.periodic_task [req-99742d31-732f-4a3a-8d76-19a8e71fa0d2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:16:27.119 143787 DEBUG oslo_concurrency.lockutils [req-54744254-83b3-48ef-800d-29018b1c290f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:16:27.119 143787 DEBUG oslo_concurrency.lockutils [req-54744254-83b3-48ef-800d-29018b1c290f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:16:36.115 143781 DEBUG oslo_service.periodic_task [req-07c04a9e-1112-4447-9345-f92568412f25 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:16:36.119 143781 DEBUG oslo_concurrency.lockutils [req-39392b49-3804-4335-bfe3-966d01e35f0e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:16:36.119 143781 DEBUG oslo_concurrency.lockutils [req-39392b49-3804-4335-bfe3-966d01e35f0e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:16:47.099 143779 DEBUG oslo_service.periodic_task [req-7975d3d7-80bd-4d26-9fa0-19aaa7debedd - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:16:47.102 143779 DEBUG oslo_concurrency.lockutils [req-37d80548-74a9-4032-a681-06f43677284c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:16:47.103 143779 DEBUG oslo_concurrency.lockutils [req-37d80548-74a9-4032-a681-06f43677284c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:16:49.057 143780 DEBUG oslo_service.periodic_task [req-a8ff9a82-0838-4dbd-af23-cf104cb45607 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:16:49.061 143780 DEBUG oslo_concurrency.lockutils [req-8edb9e64-3a90-4ba3-a356-c5993db0495a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:16:49.062 143780 DEBUG oslo_concurrency.lockutils [req-8edb9e64-3a90-4ba3-a356-c5993db0495a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:16:53.097 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:16:53.097 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:16:53.097 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:16:53.099 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:16:53.099 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:16:53.099 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:16:53.113 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:16:53.114 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:16:53.114 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:16:53.146 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:16:53.146 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:16:53.146 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:16:57.125 143787 DEBUG oslo_service.periodic_task [req-54744254-83b3-48ef-800d-29018b1c290f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:16:57.129 143787 DEBUG oslo_concurrency.lockutils [req-26d673b1-82d4-44ae-b5b8-83f272ba4611 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:16:57.129 143787 DEBUG oslo_concurrency.lockutils [req-26d673b1-82d4-44ae-b5b8-83f272ba4611 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:17:06.123 143781 DEBUG oslo_service.periodic_task [req-39392b49-3804-4335-bfe3-966d01e35f0e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:17:06.127 143781 DEBUG oslo_concurrency.lockutils [req-12ff5cc1-9134-4dac-849a-8fd8c50d63a2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:17:06.128 143781 DEBUG oslo_concurrency.lockutils [req-12ff5cc1-9134-4dac-849a-8fd8c50d63a2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:17:17.107 143779 DEBUG oslo_service.periodic_task [req-37d80548-74a9-4032-a681-06f43677284c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:17:17.111 143779 DEBUG oslo_concurrency.lockutils [req-f00927a2-547d-4131-95b1-9305693bb8cb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:17:17.111 143779 DEBUG oslo_concurrency.lockutils [req-f00927a2-547d-4131-95b1-9305693bb8cb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:17:19.070 143780 DEBUG oslo_service.periodic_task [req-8edb9e64-3a90-4ba3-a356-c5993db0495a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:17:19.074 143780 DEBUG oslo_concurrency.lockutils [req-79ce2807-9f13-4694-b22a-3b8a6cf0a480 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:17:19.074 143780 DEBUG oslo_concurrency.lockutils [req-79ce2807-9f13-4694-b22a-3b8a6cf0a480 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:17:27.136 143787 DEBUG oslo_service.periodic_task [req-26d673b1-82d4-44ae-b5b8-83f272ba4611 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:17:27.140 143787 DEBUG oslo_concurrency.lockutils [req-581dfcee-64e6-4d24-9d33-178339a7ec06 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:17:27.140 143787 DEBUG oslo_concurrency.lockutils [req-581dfcee-64e6-4d24-9d33-178339a7ec06 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:17:36.133 143781 DEBUG oslo_service.periodic_task [req-12ff5cc1-9134-4dac-849a-8fd8c50d63a2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:17:36.137 143781 DEBUG oslo_concurrency.lockutils [req-af1fd0bb-c61b-42ee-9afc-cc60c9409a52 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:17:36.137 143781 DEBUG oslo_concurrency.lockutils [req-af1fd0bb-c61b-42ee-9afc-cc60c9409a52 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:17:42.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 435bad2a9e0e400da5d625926c86553b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:17:42.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 435bad2a9e0e400da5d625926c86553b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:17:42.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 435bad2a9e0e400da5d625926c86553b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:17:42.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:17:42.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 435bad2a9e0e400da5d625926c86553b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:17:42.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:17:42.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:17:42.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 435bad2a9e0e400da5d625926c86553b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:17:42.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:17:42.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 435bad2a9e0e400da5d625926c86553b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:17:42.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 435bad2a9e0e400da5d625926c86553b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:17:42.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:17:42.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:17:42.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:17:42.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:17:42.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:17:42.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:17:42.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 435bad2a9e0e400da5d625926c86553b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:17:42.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:17:42.813 143780 DEBUG oslo_concurrency.lockutils [req-b83b11eb-638f-4c3b-9824-26007d246051 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:17:42.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:17:42.813 143780 DEBUG nova.scheduler.host_manager [req-b83b11eb-638f-4c3b-9824-26007d246051 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:17:42.813 143787 DEBUG oslo_concurrency.lockutils [req-b83b11eb-638f-4c3b-9824-26007d246051 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:17:42.813 143781 DEBUG oslo_concurrency.lockutils [req-b83b11eb-638f-4c3b-9824-26007d246051 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:17:42.813 143787 DEBUG nova.scheduler.host_manager [req-b83b11eb-638f-4c3b-9824-26007d246051 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:17:42.813 143780 DEBUG oslo_concurrency.lockutils [req-b83b11eb-638f-4c3b-9824-26007d246051 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:17:42.813 143781 DEBUG nova.scheduler.host_manager [req-b83b11eb-638f-4c3b-9824-26007d246051 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:17:42.813 143787 DEBUG oslo_concurrency.lockutils [req-b83b11eb-638f-4c3b-9824-26007d246051 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:17:42.813 143781 DEBUG oslo_concurrency.lockutils [req-b83b11eb-638f-4c3b-9824-26007d246051 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:17:42.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:17:42.814 143779 DEBUG oslo_concurrency.lockutils [req-b83b11eb-638f-4c3b-9824-26007d246051 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:17:42.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:17:42.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:17:42.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:17:42.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:17:42.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:17:42.814 143779 DEBUG nova.scheduler.host_manager [req-b83b11eb-638f-4c3b-9824-26007d246051 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:17:42.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:17:42.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:17:42.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:17:42.814 143779 DEBUG oslo_concurrency.lockutils [req-b83b11eb-638f-4c3b-9824-26007d246051 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:17:42.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:17:42.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:17:42.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:17:43.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:17:43.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:17:43.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:17:43.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:17:43.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:17:43.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:17:43.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:17:43.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:17:43.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:17:43.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:17:43.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:17:43.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:17:45.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:17:45.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:17:45.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:17:45.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:17:45.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:17:45.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:17:45.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:17:45.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02
01:17:45.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:17:45.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:17:45.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:17:45.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:17:48.058 143779 DEBUG oslo_service.periodic_task [req-f00927a2-547d-4131-95b1-9305693bb8cb - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:17:48.062 143779 DEBUG oslo_concurrency.lockutils [req-c4230190-959c-44bc-815f-197353547bb7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:17:48.062 143779 DEBUG oslo_concurrency.lockutils [req-c4230190-959c-44bc-815f-197353547bb7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:17:49.081 143780 DEBUG oslo_service.periodic_task [req-79ce2807-9f13-4694-b22a-3b8a6cf0a480 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:17:49.086 143780 DEBUG oslo_concurrency.lockutils 
[req-df92cb38-9c82-4fb8-a2de-34982c09f81f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:17:49.086 143780 DEBUG oslo_concurrency.lockutils [req-df92cb38-9c82-4fb8-a2de-34982c09f81f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:17:49.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:17:49.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:17:49.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:17:49.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:17:49.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:17:49.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:17:49.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:17:49.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:17:49.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:17:49.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:17:49.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:17:49.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:17:57.146 143787 DEBUG oslo_service.periodic_task [req-581dfcee-64e6-4d24-9d33-178339a7ec06 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:17:57.150 143787 DEBUG oslo_concurrency.lockutils [req-21b7ff19-60dc-477a-b743-e2166223a48d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:17:57.150 143787 DEBUG oslo_concurrency.lockutils [req-21b7ff19-60dc-477a-b743-e2166223a48d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:17:57.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:17:57.824 143787 DEBUG oslo_messaging._drivers.amqpdriver 
[-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:17:57.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:17:57.824 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:17:57.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:17:57.825 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:17:57.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:17:57.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:17:57.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:17:57.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:17:57.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:17:57.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:18:06.141 143781 DEBUG oslo_service.periodic_task 
[req-af1fd0bb-c61b-42ee-9afc-cc60c9409a52 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:18:06.145 143781 DEBUG oslo_concurrency.lockutils [req-6dc0b4f8-5eac-40d8-9281-321da890ddc9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:18:06.146 143781 DEBUG oslo_concurrency.lockutils [req-6dc0b4f8-5eac-40d8-9281-321da890ddc9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:18:13.826 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:18:13.826 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:18:13.827 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:18:13.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:18:13.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:18:13.827 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:18:13.827 143779 
DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:18:13.827 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:18:13.827 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:18:13.829 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:18:13.830 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:18:13.830 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:18:18.070 143779 DEBUG oslo_service.periodic_task [req-c4230190-959c-44bc-815f-197353547bb7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:18:18.074 143779 DEBUG oslo_concurrency.lockutils [req-edb5ab47-27c0-4ead-a18b-75c6ceb3172b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:18:18.074 143779 DEBUG oslo_concurrency.lockutils [req-edb5ab47-27c0-4ead-a18b-75c6ceb3172b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 
2026-04-02 01:18:19.093 143780 DEBUG oslo_service.periodic_task [req-df92cb38-9c82-4fb8-a2de-34982c09f81f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:18:19.098 143780 DEBUG oslo_concurrency.lockutils [req-8c4e974e-0dc6-4581-852e-165cf1c8d76e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:18:19.098 143780 DEBUG oslo_concurrency.lockutils [req-8c4e974e-0dc6-4581-852e-165cf1c8d76e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:18:27.157 143787 DEBUG oslo_service.periodic_task [req-21b7ff19-60dc-477a-b743-e2166223a48d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:18:27.161 143787 DEBUG oslo_concurrency.lockutils [req-d4d909da-1757-4296-93d3-cb293c845185 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:18:27.161 143787 DEBUG oslo_concurrency.lockutils [req-d4d909da-1757-4296-93d3-cb293c845185 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:18:36.151 143781 DEBUG oslo_service.periodic_task [req-6dc0b4f8-5eac-40d8-9281-321da890ddc9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:18:36.155 143781 DEBUG oslo_concurrency.lockutils [req-03aa5529-90f9-4748-b172-c333a667ef9c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:18:36.155 143781 DEBUG oslo_concurrency.lockutils [req-03aa5529-90f9-4748-b172-c333a667ef9c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:18:49.057 143779 DEBUG oslo_service.periodic_task [req-edb5ab47-27c0-4ead-a18b-75c6ceb3172b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:18:49.062 143779 DEBUG oslo_concurrency.lockutils [req-88353ceb-f7b0-42ac-aaf6-07f76d080bed - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:18:49.062 143779 DEBUG oslo_concurrency.lockutils [req-88353ceb-f7b0-42ac-aaf6-07f76d080bed - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:18:49.107 143780 DEBUG oslo_service.periodic_task [req-8c4e974e-0dc6-4581-852e-165cf1c8d76e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:18:49.111 143780 DEBUG oslo_concurrency.lockutils [req-45289954-242b-4efd-89d8-6b39ee1723fd - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:18:49.111 143780 DEBUG oslo_concurrency.lockutils [req-45289954-242b-4efd-89d8-6b39ee1723fd - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:18:53.101 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:18:53.102 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:18:53.102 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:18:53.102 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:18:53.103 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:18:53.103 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:18:53.116 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:18:53.116 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:18:53.117 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:18:53.149 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:18:53.150 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:18:53.150 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:18:57.168 143787 DEBUG oslo_service.periodic_task [req-d4d909da-1757-4296-93d3-cb293c845185 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:18:57.171 143787 DEBUG oslo_concurrency.lockutils [req-83cee720-5c21-4ee1-b411-7f34afd97609 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:18:57.172 143787 DEBUG oslo_concurrency.lockutils [req-83cee720-5c21-4ee1-b411-7f34afd97609 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:19:07.042 143781 DEBUG oslo_service.periodic_task [req-03aa5529-90f9-4748-b172-c333a667ef9c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:19:07.046 143781 DEBUG oslo_concurrency.lockutils [req-0e320781-12d4-4d99-bf83-57052d5021da - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:19:07.046 143781 DEBUG oslo_concurrency.lockutils [req-0e320781-12d4-4d99-bf83-57052d5021da - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:19:19.070 143779 DEBUG oslo_service.periodic_task [req-88353ceb-f7b0-42ac-aaf6-07f76d080bed - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:19:19.074 143779 DEBUG oslo_concurrency.lockutils [req-144e4b62-2d08-46de-8fb9-dc2b2d842ec6 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:19:19.075 143779 DEBUG oslo_concurrency.lockutils [req-144e4b62-2d08-46de-8fb9-dc2b2d842ec6 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:19:19.120 143780 DEBUG oslo_service.periodic_task [req-45289954-242b-4efd-89d8-6b39ee1723fd - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:19:19.124 143780 DEBUG oslo_concurrency.lockutils [req-077a24be-6bcd-4089-81c9-083f15dc991f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:19:19.124 143780 DEBUG oslo_concurrency.lockutils [req-077a24be-6bcd-4089-81c9-083f15dc991f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:19:28.083 143787 DEBUG oslo_service.periodic_task [req-83cee720-5c21-4ee1-b411-7f34afd97609 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:19:28.087 143787 DEBUG oslo_concurrency.lockutils [req-b12e7d28-0a0d-4746-9c3b-17bc330ce3c2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:19:28.087 143787 DEBUG oslo_concurrency.lockutils [req-b12e7d28-0a0d-4746-9c3b-17bc330ce3c2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:19:37.052 143781 DEBUG oslo_service.periodic_task [req-0e320781-12d4-4d99-bf83-57052d5021da - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:19:37.056 143781 DEBUG oslo_concurrency.lockutils [req-86fdccd7-66d6-445e-83f3-689e0861580f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:19:37.056 143781 DEBUG oslo_concurrency.lockutils [req-86fdccd7-66d6-445e-83f3-689e0861580f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:19:43.810 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: bc4383a8833049069a830c435ad19186 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:19:43.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: bc4383a8833049069a830c435ad19186 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:19:43.810 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: bc4383a8833049069a830c435ad19186 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:19:43.810 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: bc4383a8833049069a830c435ad19186 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:19:43.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:19:43.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:19:43.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:19:43.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:19:43.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: bc4383a8833049069a830c435ad19186 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:19:43.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: bc4383a8833049069a830c435ad19186 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:19:43.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: bc4383a8833049069a830c435ad19186 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:19:43.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: bc4383a8833049069a830c435ad19186 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:19:43.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:19:43.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:19:43.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:19:43.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:19:43.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:19:43.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:19:43.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:19:43.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:19:43.812 143787 DEBUG oslo_concurrency.lockutils [req-c05b78e0-7756-4725-a565-46bc1ed8a644 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:19:43.812 143781 DEBUG oslo_concurrency.lockutils [req-c05b78e0-7756-4725-a565-46bc1ed8a644 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:19:43.812 143779 DEBUG oslo_concurrency.lockutils [req-c05b78e0-7756-4725-a565-46bc1ed8a644 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:19:43.812 143780 DEBUG oslo_concurrency.lockutils [req-c05b78e0-7756-4725-a565-46bc1ed8a644 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:19:43.812 143787 DEBUG nova.scheduler.host_manager [req-c05b78e0-7756-4725-a565-46bc1ed8a644 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:19:43.812 143781 DEBUG nova.scheduler.host_manager [req-c05b78e0-7756-4725-a565-46bc1ed8a644 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:19:43.812 143779 DEBUG nova.scheduler.host_manager [req-c05b78e0-7756-4725-a565-46bc1ed8a644 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:19:43.812 143780 DEBUG nova.scheduler.host_manager [req-c05b78e0-7756-4725-a565-46bc1ed8a644 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:19:43.812 143787 DEBUG oslo_concurrency.lockutils [req-c05b78e0-7756-4725-a565-46bc1ed8a644 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:19:43.812 143781 DEBUG oslo_concurrency.lockutils [req-c05b78e0-7756-4725-a565-46bc1ed8a644 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:19:43.812 143779 DEBUG oslo_concurrency.lockutils [req-c05b78e0-7756-4725-a565-46bc1ed8a644 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:19:43.812 143780 DEBUG oslo_concurrency.lockutils [req-c05b78e0-7756-4725-a565-46bc1ed8a644 - - -
- -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:19:43.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:19:43.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:19:43.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:19:43.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:19:43.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:19:43.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:19:43.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:19:43.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:19:43.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:19:43.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 
2026-04-02 01:19:43.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:19:43.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:19:44.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:19:44.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:19:44.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:19:44.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:19:44.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:19:44.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:19:44.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:19:44.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:19:44.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:19:44.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:19:44.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:19:44.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:19:46.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:19:46.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:19:46.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:19:46.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:19:46.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:19:46.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:19:46.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:19:46.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:19:46.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:19:46.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:19:46.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:19:46.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:19:49.081 143779 DEBUG oslo_service.periodic_task [req-144e4b62-2d08-46de-8fb9-dc2b2d842ec6 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:19:49.085 143779 DEBUG oslo_concurrency.lockutils [req-17347140-ce01-4339-ade2-1a2054b1aad5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:19:49.086 143779 DEBUG oslo_concurrency.lockutils [req-17347140-ce01-4339-ade2-1a2054b1aad5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:19:50.044 143780 DEBUG oslo_service.periodic_task [req-077a24be-6bcd-4089-81c9-083f15dc991f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks 
/usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:19:50.049 143780 DEBUG oslo_concurrency.lockutils [req-aa3c3f19-7f5c-486a-bfff-f5f23f5eaa50 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:19:50.049 143780 DEBUG oslo_concurrency.lockutils [req-aa3c3f19-7f5c-486a-bfff-f5f23f5eaa50 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:19:50.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:19:50.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:19:50.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:19:50.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:19:50.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:19:50.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:19:50.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:19:50.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:19:50.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:19:50.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:19:50.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:19:50.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:19:58.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:19:58.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:19:58.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:19:58.825 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:19:58.825 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:19:58.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:19:58.825 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:19:58.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:19:58.826 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:19:58.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:19:58.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:19:58.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:19:59.083 143787 DEBUG oslo_service.periodic_task [req-b12e7d28-0a0d-4746-9c3b-17bc330ce3c2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:19:59.087 143787 DEBUG oslo_concurrency.lockutils [req-842baf29-2516-4220-bb00-cd3f35744a18 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:19:59.087 143787 DEBUG oslo_concurrency.lockutils [req-842baf29-2516-4220-bb00-cd3f35744a18 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:20:07.063 143781 DEBUG oslo_service.periodic_task [req-86fdccd7-66d6-445e-83f3-689e0861580f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:20:07.067 143781 DEBUG oslo_concurrency.lockutils [req-f66b95d9-89a9-4b21-a1b4-bbf320a04122 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:20:07.067 143781 DEBUG oslo_concurrency.lockutils [req-f66b95d9-89a9-4b21-a1b4-bbf320a04122 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:20:14.826 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:20:14.826 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:20:14.826 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:20:14.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:20:14.828 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:20:14.828 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:20:14.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:20:14.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:20:14.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:20:14.829 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:20:14.830 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:20:14.830 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:20:20.057 143779 DEBUG oslo_service.periodic_task [req-17347140-ce01-4339-ade2-1a2054b1aad5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:20:20.061 143779 DEBUG oslo_concurrency.lockutils [req-6aaa0b0e-1411-4cb4-81e8-71ca385ea2dd - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:20:20.062 143779 DEBUG 
oslo_concurrency.lockutils [req-6aaa0b0e-1411-4cb4-81e8-71ca385ea2dd - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:20:21.044 143780 DEBUG oslo_service.periodic_task [req-aa3c3f19-7f5c-486a-bfff-f5f23f5eaa50 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:20:21.048 143780 DEBUG oslo_concurrency.lockutils [req-8af3c454-49c6-479c-a1dc-fc33415050cd - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:20:21.048 143780 DEBUG oslo_concurrency.lockutils [req-8af3c454-49c6-479c-a1dc-fc33415050cd - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:20:29.096 143787 DEBUG oslo_service.periodic_task [req-842baf29-2516-4220-bb00-cd3f35744a18 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:20:29.100 143787 DEBUG oslo_concurrency.lockutils [req-3b2f8ad3-aa96-483a-8be9-8d6b62cfde35 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:20:29.100 143787 DEBUG oslo_concurrency.lockutils [req-3b2f8ad3-aa96-483a-8be9-8d6b62cfde35 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:20:37.074 143781 DEBUG oslo_service.periodic_task [req-f66b95d9-89a9-4b21-a1b4-bbf320a04122 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:20:37.078 143781 DEBUG oslo_concurrency.lockutils [req-efb084d9-ecb1-4372-b083-9dd76ab8016b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:20:37.078 143781 DEBUG oslo_concurrency.lockutils [req-efb084d9-ecb1-4372-b083-9dd76ab8016b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:20:50.069 143779 DEBUG oslo_service.periodic_task [req-6aaa0b0e-1411-4cb4-81e8-71ca385ea2dd - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:20:50.076 143779 DEBUG oslo_concurrency.lockutils [req-2dcb4301-442a-4887-9e9a-d3f3bb3a670c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:20:50.076 143779 DEBUG oslo_concurrency.lockutils [req-2dcb4301-442a-4887-9e9a-d3f3bb3a670c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:20:51.057 143780 DEBUG oslo_service.periodic_task [req-8af3c454-49c6-479c-a1dc-fc33415050cd - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:20:51.060 143780 DEBUG oslo_concurrency.lockutils [req-7723905e-472b-4c81-adc3-2eb61ea2b02e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:20:51.061 143780 DEBUG oslo_concurrency.lockutils [req-7723905e-472b-4c81-adc3-2eb61ea2b02e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:20:53.104 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:20:53.105 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:20:53.105 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:20:53.107 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:20:53.107 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:20:53.107 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:20:53.117 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:20:53.118 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:20:53.118 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:20:53.153 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:20:53.154 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:20:53.154 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:20:59.108 143787 DEBUG oslo_service.periodic_task [req-3b2f8ad3-aa96-483a-8be9-8d6b62cfde35 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:20:59.112 143787 DEBUG oslo_concurrency.lockutils [req-9179e155-18d1-449f-ad2d-c7538b85724d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:20:59.112 143787 DEBUG oslo_concurrency.lockutils [req-9179e155-18d1-449f-ad2d-c7538b85724d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:21:07.085 143781 DEBUG oslo_service.periodic_task [req-efb084d9-ecb1-4372-b083-9dd76ab8016b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:21:07.089 143781 DEBUG oslo_concurrency.lockutils [req-a7041892-de54-4475-b8e0-746755579ca0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:21:07.090 143781 DEBUG oslo_concurrency.lockutils [req-a7041892-de54-4475-b8e0-746755579ca0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:21:20.086 143779 DEBUG oslo_service.periodic_task [req-2dcb4301-442a-4887-9e9a-d3f3bb3a670c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:21:20.090 143779 DEBUG oslo_concurrency.lockutils [req-e02ab92e-cd80-4aef-8957-33d3fe29d96c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:21:20.090 143779 DEBUG oslo_concurrency.lockutils [req-e02ab92e-cd80-4aef-8957-33d3fe29d96c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:21:21.069 143780 DEBUG oslo_service.periodic_task [req-7723905e-472b-4c81-adc3-2eb61ea2b02e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:21:21.074 143780 DEBUG oslo_concurrency.lockutils [req-288cc44b-6b93-42eb-a670-23c1f5f246c7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:21:21.074 143780 DEBUG oslo_concurrency.lockutils [req-288cc44b-6b93-42eb-a670-23c1f5f246c7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:21:29.120 143787 DEBUG oslo_service.periodic_task [req-9179e155-18d1-449f-ad2d-c7538b85724d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:21:29.124 143787 DEBUG oslo_concurrency.lockutils [req-3b1c6293-b6c1-4ed1-ac72-530bb93341cb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:21:29.124 143787 DEBUG oslo_concurrency.lockutils [req-3b1c6293-b6c1-4ed1-ac72-530bb93341cb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:21:37.096 143781 DEBUG 
oslo_service.periodic_task [req-a7041892-de54-4475-b8e0-746755579ca0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:21:37.100 143781 DEBUG oslo_concurrency.lockutils [req-99a075dc-c02a-4de7-870b-7d06945b9c73 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:21:37.100 143781 DEBUG oslo_concurrency.lockutils [req-99a075dc-c02a-4de7-870b-7d06945b9c73 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:21:47.810 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 54bab2da5f3e47a882705c91dee31b9b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:21:47.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 54bab2da5f3e47a882705c91dee31b9b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:21:47.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 54bab2da5f3e47a882705c91dee31b9b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:21:47.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 54bab2da5f3e47a882705c91dee31b9b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:21:47.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:21:47.811 
143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:21:47.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:21:47.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 54bab2da5f3e47a882705c91dee31b9b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:21:47.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:21:47.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 54bab2da5f3e47a882705c91dee31b9b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:21:47.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 54bab2da5f3e47a882705c91dee31b9b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:21:47.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 54bab2da5f3e47a882705c91dee31b9b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:21:47.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:21:47.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:21:47.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:21:47.811 143779 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:21:47.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:21:47.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:21:47.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:21:47.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:21:47.812 143787 DEBUG oslo_concurrency.lockutils [req-8ba186c9-ee56-453e-8ec8-22768ed6382c - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:21:47.812 143781 DEBUG oslo_concurrency.lockutils [req-8ba186c9-ee56-453e-8ec8-22768ed6382c - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:21:47.812 143779 DEBUG oslo_concurrency.lockutils [req-8ba186c9-ee56-453e-8ec8-22768ed6382c - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:21:47.812 143780 DEBUG oslo_concurrency.lockutils [req-8ba186c9-ee56-453e-8ec8-22768ed6382c - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:21:47.812 143787 DEBUG nova.scheduler.host_manager [req-8ba186c9-ee56-453e-8ec8-22768ed6382c - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:21:47.812 143779 DEBUG nova.scheduler.host_manager [req-8ba186c9-ee56-453e-8ec8-22768ed6382c - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:21:47.812 143781 DEBUG nova.scheduler.host_manager [req-8ba186c9-ee56-453e-8ec8-22768ed6382c - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:21:47.812 143780 DEBUG nova.scheduler.host_manager [req-8ba186c9-ee56-453e-8ec8-22768ed6382c - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:21:47.812 143787 DEBUG oslo_concurrency.lockutils [req-8ba186c9-ee56-453e-8ec8-22768ed6382c - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:21:47.812 143781 DEBUG oslo_concurrency.lockutils [req-8ba186c9-ee56-453e-8ec8-22768ed6382c - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:21:47.812 143779 DEBUG oslo_concurrency.lockutils [req-8ba186c9-ee56-453e-8ec8-22768ed6382c - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:21:47.812 143780 DEBUG oslo_concurrency.lockutils [req-8ba186c9-ee56-453e-8ec8-22768ed6382c - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:21:47.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:21:47.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:21:47.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:21:47.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:21:47.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:21:47.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:21:47.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:21:47.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:21:47.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:21:47.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:21:47.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:21:47.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:21:48.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:21:48.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:21:48.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:21:48.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:21:48.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:21:48.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:21:48.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:21:48.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:21:48.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:21:48.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:21:48.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:21:48.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:21:50.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:21:50.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:21:50.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:21:50.819 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:21:50.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:21:50.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:21:50.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:21:50.819 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:21:50.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:21:50.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:21:50.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:21:50.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:21:51.057 143779 DEBUG oslo_service.periodic_task 
[req-e02ab92e-cd80-4aef-8957-33d3fe29d96c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:21:51.061 143779 DEBUG oslo_concurrency.lockutils [req-0e8c3263-1b0f-415e-8420-96e46eb7b695 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:21:51.061 143779 DEBUG oslo_concurrency.lockutils [req-0e8c3263-1b0f-415e-8420-96e46eb7b695 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:21:51.078 143780 DEBUG oslo_service.periodic_task [req-288cc44b-6b93-42eb-a670-23c1f5f246c7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:21:51.082 143780 DEBUG oslo_concurrency.lockutils [req-76b693e8-f584-4a60-8482-36cc3fc0d5de - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:21:51.083 143780 DEBUG oslo_concurrency.lockutils [req-76b693e8-f584-4a60-8482-36cc3fc0d5de - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:21:54.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 
2026-04-02 01:21:54.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:21:54.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:21:54.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:21:54.821 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:21:54.821 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:21:54.821 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:21:54.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:21:54.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:21:54.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:21:54.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:21:54.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 
2026-04-02 01:21:59.132 143787 DEBUG oslo_service.periodic_task [req-3b1c6293-b6c1-4ed1-ac72-530bb93341cb - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:21:59.136 143787 DEBUG oslo_concurrency.lockutils [req-6ddc53d9-be6e-446c-98f7-cd75095e40eb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:21:59.137 143787 DEBUG oslo_concurrency.lockutils [req-6ddc53d9-be6e-446c-98f7-cd75095e40eb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:22:02.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:22:02.826 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:22:02.826 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:22:02.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:22:02.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:22:02.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:22:02.829 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:22:02.829 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:22:02.829 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:22:02.830 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:22:02.830 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:22:02.830 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:22:07.105 143781 DEBUG oslo_service.periodic_task [req-99a075dc-c02a-4de7-870b-7d06945b9c73 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:22:07.109 143781 DEBUG oslo_concurrency.lockutils [req-c97608f3-c06b-45d4-a4f8-db0495f14b03 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:22:07.109 143781 DEBUG oslo_concurrency.lockutils [req-c97608f3-c06b-45d4-a4f8-db0495f14b03 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:22:18.827 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:22:18.828 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:22:18.828 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:22:18.829 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:22:18.830 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:22:18.830 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:22:18.831 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:22:18.832 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:22:18.832 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:22:18.832 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 
01:22:18.832 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:22:18.832 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:22:21.089 143780 DEBUG oslo_service.periodic_task [req-76b693e8-f584-4a60-8482-36cc3fc0d5de - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:22:21.093 143780 DEBUG oslo_concurrency.lockutils [req-02de7559-03b4-46fd-88f5-721a5888fb19 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:22:21.094 143780 DEBUG oslo_concurrency.lockutils [req-02de7559-03b4-46fd-88f5-721a5888fb19 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:22:22.057 143779 DEBUG oslo_service.periodic_task [req-0e8c3263-1b0f-415e-8420-96e46eb7b695 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:22:22.061 143779 DEBUG oslo_concurrency.lockutils [req-3a47b4e7-e8cb-4862-967a-584aef19a5fc - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:22:22.062 143779 DEBUG oslo_concurrency.lockutils [req-3a47b4e7-e8cb-4862-967a-584aef19a5fc - - - 
- -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:22:29.144 143787 DEBUG oslo_service.periodic_task [req-6ddc53d9-be6e-446c-98f7-cd75095e40eb - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:22:29.148 143787 DEBUG oslo_concurrency.lockutils [req-1c019881-3f13-408a-8ca3-e7a2e3138ce1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:22:29.148 143787 DEBUG oslo_concurrency.lockutils [req-1c019881-3f13-408a-8ca3-e7a2e3138ce1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:22:37.115 143781 DEBUG oslo_service.periodic_task [req-c97608f3-c06b-45d4-a4f8-db0495f14b03 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:22:37.119 143781 DEBUG oslo_concurrency.lockutils [req-d9b1f992-5157-4f64-bce2-5822456679c8 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:22:37.120 143781 DEBUG oslo_concurrency.lockutils [req-d9b1f992-5157-4f64-bce2-5822456679c8 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:22:51.100 143780 DEBUG oslo_service.periodic_task [req-02de7559-03b4-46fd-88f5-721a5888fb19 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:22:51.104 143780 DEBUG oslo_concurrency.lockutils [req-07503d10-1776-43d6-bba8-5b6b062e5855 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:22:51.104 143780 DEBUG oslo_concurrency.lockutils [req-07503d10-1776-43d6-bba8-5b6b062e5855 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:22:52.067 143779 DEBUG oslo_service.periodic_task [req-3a47b4e7-e8cb-4862-967a-584aef19a5fc - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:22:52.071 143779 DEBUG oslo_concurrency.lockutils [req-0ef4cfed-a3e0-4990-a1b9-fceca4d4829b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:22:52.072 143779 DEBUG oslo_concurrency.lockutils [req-0ef4cfed-a3e0-4990-a1b9-fceca4d4829b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:22:53.105 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:22:53.106 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:22:53.106 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:22:53.108 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:22:53.108 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:22:53.109 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:22:53.120 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:22:53.120 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:22:53.120 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:22:53.150 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:22:53.151 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:22:53.151 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:23:00.083 143787 DEBUG oslo_service.periodic_task [req-1c019881-3f13-408a-8ca3-e7a2e3138ce1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:23:00.087 143787 DEBUG oslo_concurrency.lockutils [req-db01f816-5320-4bc6-8d31-120846ec4927 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:23:00.088 143787 DEBUG oslo_concurrency.lockutils [req-db01f816-5320-4bc6-8d31-120846ec4927 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:23:07.126 143781 DEBUG oslo_service.periodic_task [req-d9b1f992-5157-4f64-bce2-5822456679c8 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:23:07.131 143781 DEBUG oslo_concurrency.lockutils [req-d2dc37a0-1c6f-4e72-a41f-83626b94b119 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:23:07.131 143781 DEBUG oslo_concurrency.lockutils [req-d2dc37a0-1c6f-4e72-a41f-83626b94b119 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:23:21.115 143780 DEBUG oslo_service.periodic_task [req-07503d10-1776-43d6-bba8-5b6b062e5855 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:23:21.119 143780 DEBUG oslo_concurrency.lockutils [req-2776f22f-008b-4292-9289-859eb0fe6698 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:23:21.120 143780 DEBUG oslo_concurrency.lockutils [req-2776f22f-008b-4292-9289-859eb0fe6698 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:23:23.057 143779 DEBUG oslo_service.periodic_task [req-0ef4cfed-a3e0-4990-a1b9-fceca4d4829b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:23:23.061 143779 DEBUG oslo_concurrency.lockutils [req-9e965125-0f32-4f46-8c67-4fc81dd004af - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:23:23.061 143779 DEBUG oslo_concurrency.lockutils [req-9e965125-0f32-4f46-8c67-4fc81dd004af - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:23:30.096 143787 DEBUG oslo_service.periodic_task [req-db01f816-5320-4bc6-8d31-120846ec4927 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:23:30.101 143787 DEBUG oslo_concurrency.lockutils [req-224b08a1-7b3a-4fda-bc2e-eb07726b215b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:23:30.101 143787 DEBUG oslo_concurrency.lockutils [req-224b08a1-7b3a-4fda-bc2e-eb07726b215b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:23:37.137 143781 DEBUG oslo_service.periodic_task [req-d2dc37a0-1c6f-4e72-a41f-83626b94b119 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:23:37.141 143781 DEBUG oslo_concurrency.lockutils [req-2ebbcb62-043c-4c60-be37-04a4585eb100 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:23:37.141 143781 DEBUG oslo_concurrency.lockutils [req-2ebbcb62-043c-4c60-be37-04a4585eb100 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:23:50.810 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 0512acd7551940359075e129812ca6a9 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:23:50.810 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 0512acd7551940359075e129812ca6a9 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:23:50.810 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 0512acd7551940359075e129812ca6a9 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:23:50.810 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 0512acd7551940359075e129812ca6a9 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:23:50.810 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:23:50.810 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:23:50.810 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:23:50.810 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:23:50.810 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 0512acd7551940359075e129812ca6a9 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:23:50.810 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 0512acd7551940359075e129812ca6a9 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:23:50.810 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 0512acd7551940359075e129812ca6a9 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:23:50.810 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 0512acd7551940359075e129812ca6a9 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:23:50.810 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:23:50.810 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:23:50.810 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:23:50.810 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:23:50.810 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:23:50.810 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:23:50.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:23:50.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:23:50.811 143781 DEBUG oslo_concurrency.lockutils [req-f3b54002-3613-4d03-958e-56b359e264e0 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:23:50.811 143780 DEBUG oslo_concurrency.lockutils [req-f3b54002-3613-4d03-958e-56b359e264e0 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:23:50.811 143779 DEBUG oslo_concurrency.lockutils [req-f3b54002-3613-4d03-958e-56b359e264e0 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:23:50.811 143787 DEBUG oslo_concurrency.lockutils [req-f3b54002-3613-4d03-958e-56b359e264e0 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:23:50.811 143780 DEBUG nova.scheduler.host_manager [req-f3b54002-3613-4d03-958e-56b359e264e0 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:23:50.811 143781 DEBUG nova.scheduler.host_manager [req-f3b54002-3613-4d03-958e-56b359e264e0 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:23:50.811 143779 DEBUG nova.scheduler.host_manager [req-f3b54002-3613-4d03-958e-56b359e264e0 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:23:50.811 143787 DEBUG nova.scheduler.host_manager [req-f3b54002-3613-4d03-958e-56b359e264e0 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:23:50.811 143780 DEBUG oslo_concurrency.lockutils [req-f3b54002-3613-4d03-958e-56b359e264e0 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:23:50.812 143781 DEBUG oslo_concurrency.lockutils [req-f3b54002-3613-4d03-958e-56b359e264e0 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:23:50.812 143787 DEBUG oslo_concurrency.lockutils [req-f3b54002-3613-4d03-958e-56b359e264e0 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:23:50.812 143779 DEBUG oslo_concurrency.lockutils [req-f3b54002-3613-4d03-958e-56b359e264e0 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:23:50.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:23:50.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:23:50.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:23:50.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:23:50.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:23:50.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:23:50.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:23:50.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:23:50.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:23:50.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:23:50.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:23:50.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:23:51.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:23:51.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:23:51.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:23:51.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:23:51.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:23:51.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:23:51.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:23:51.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:23:51.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:23:51.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:23:51.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:23:51.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:23:52.044 143780 DEBUG oslo_service.periodic_task [req-2776f22f-008b-4292-9289-859eb0fe6698 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:23:52.048 143780 DEBUG oslo_concurrency.lockutils [req-2e836392-1460-499a-843a-f899a9017638 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:23:52.049 143780 DEBUG oslo_concurrency.lockutils [req-2e836392-1460-499a-843a-f899a9017638 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:23:53.067 143779 DEBUG oslo_service.periodic_task [req-9e965125-0f32-4f46-8c67-4fc81dd004af - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:23:53.071 143779 DEBUG oslo_concurrency.lockutils [req-ff6b4209-b5f9-4256-a207-815148f99af6 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:23:53.071 143779 DEBUG oslo_concurrency.lockutils [req-ff6b4209-b5f9-4256-a207-815148f99af6 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:23:53.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:23:53.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:23:53.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:23:53.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:23:53.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:23:53.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:23:53.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:23:53.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:23:53.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:23:53.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:23:53.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:23:53.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:23:57.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:23:57.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:23:57.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:23:57.819 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:23:57.819 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:23:57.819 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:23:57.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:23:57.821 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:23:57.821 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:23:57.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:23:57.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:23:57.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:24:00.109 143787 DEBUG oslo_service.periodic_task [req-224b08a1-7b3a-4fda-bc2e-eb07726b215b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:24:00.113 143787 DEBUG oslo_concurrency.lockutils [req-9379c45e-3e25-47fa-b141-3d667b0c518e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:24:00.113 143787 DEBUG oslo_concurrency.lockutils [req-9379c45e-3e25-47fa-b141-3d667b0c518e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:24:05.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:24:05.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:24:05.821 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:24:05.821 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:24:05.821 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:24:05.821 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:24:05.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:24:05.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:24:05.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:24:05.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:24:05.824 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:24:05.824 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:24:07.147 143781 DEBUG oslo_service.periodic_task [req-2ebbcb62-043c-4c60-be37-04a4585eb100 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:24:07.151 143781 DEBUG oslo_concurrency.lockutils [req-64fd6392-5968-48aa-a3ab-14a01cb1475c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:24:07.151 143781 DEBUG oslo_concurrency.lockutils [req-64fd6392-5968-48aa-a3ab-14a01cb1475c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:24:21.823 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:24:21.823 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:24:21.823 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:24:21.823 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:24:21.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:24:21.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:24:21.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:24:21.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:24:21.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:24:21.826 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:24:21.826 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:24:21.826 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:24:23.044 143780 DEBUG oslo_service.periodic_task [req-2e836392-1460-499a-843a-f899a9017638 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:24:23.048 143780 DEBUG oslo_concurrency.lockutils [req-7f0dd5d0-addb-4538-a946-514f3940a8da - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:24:23.048 143780 DEBUG oslo_concurrency.lockutils [req-7f0dd5d0-addb-4538-a946-514f3940a8da - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:24:23.076 143779 DEBUG oslo_service.periodic_task [req-ff6b4209-b5f9-4256-a207-815148f99af6 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:24:23.080 143779 DEBUG oslo_concurrency.lockutils [req-10683059-afd2-4b81-abac-d6b71342cae0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:24:23.080 143779 DEBUG oslo_concurrency.lockutils [req-10683059-afd2-4b81-abac-d6b71342cae0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:24:31.083 143787 DEBUG oslo_service.periodic_task [req-9379c45e-3e25-47fa-b141-3d667b0c518e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:24:31.087 143787 DEBUG oslo_concurrency.lockutils [req-a6b1ac4a-e906-4f23-8c79-ebd37d623bd4 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:24:31.087 143787 DEBUG oslo_concurrency.lockutils [req-a6b1ac4a-e906-4f23-8c79-ebd37d623bd4 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:24:37.156 143781 DEBUG oslo_service.periodic_task [req-64fd6392-5968-48aa-a3ab-14a01cb1475c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:24:37.161 143781 DEBUG oslo_concurrency.lockutils [req-02300679-48f5-4787-967a-3c801f5a0230 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:24:37.161 143781 DEBUG oslo_concurrency.lockutils [req-02300679-48f5-4787-967a-3c801f5a0230 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:24:53.085 143779 DEBUG oslo_service.periodic_task [req-10683059-afd2-4b81-abac-d6b71342cae0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:24:53.090 143779 DEBUG oslo_concurrency.lockutils [req-7b35db8d-6547-42e8-bbce-9e01f82032f3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:24:53.091 143779 DEBUG oslo_concurrency.lockutils [req-7b35db8d-6547-42e8-bbce-9e01f82032f3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:24:53.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:24:53.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:24:53.825 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:24:53.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:24:53.826 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:24:53.826 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:24:53.826 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:24:53.826 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:24:53.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:24:53.828 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:24:53.828 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:24:53.828 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:24:54.044 143780 DEBUG oslo_service.periodic_task [req-7f0dd5d0-addb-4538-a946-514f3940a8da - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:24:54.048 143780 DEBUG oslo_concurrency.lockutils [req-a957bd45-fe7a-4246-bbe1-0dbd1e05889b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:24:54.048 143780 DEBUG oslo_concurrency.lockutils [req-a957bd45-fe7a-4246-bbe1-0dbd1e05889b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:25:02.083 143787 DEBUG oslo_service.periodic_task [req-a6b1ac4a-e906-4f23-8c79-ebd37d623bd4 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:25:02.087 143787 DEBUG oslo_concurrency.lockutils [req-9df01d79-0cca-4d31-bc30-a0b39a9daa65 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:25:02.087 143787 DEBUG oslo_concurrency.lockutils [req-9df01d79-0cca-4d31-bc30-a0b39a9daa65 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:25:07.167 143781 DEBUG oslo_service.periodic_task [req-02300679-48f5-4787-967a-3c801f5a0230 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:25:07.170 143781 DEBUG oslo_concurrency.lockutils [req-f626de29-f258-461b-9bb8-de65358e875a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:25:07.171 143781 DEBUG
oslo_concurrency.lockutils [req-f626de29-f258-461b-9bb8-de65358e875a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:25:23.102 143779 DEBUG oslo_service.periodic_task [req-7b35db8d-6547-42e8-bbce-9e01f82032f3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:25:23.106 143779 DEBUG oslo_concurrency.lockutils [req-c2958f49-02a9-4aac-8c5b-43bf7cf67a32 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:25:23.107 143779 DEBUG oslo_concurrency.lockutils [req-c2958f49-02a9-4aac-8c5b-43bf7cf67a32 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:25:25.044 143780 DEBUG oslo_service.periodic_task [req-a957bd45-fe7a-4246-bbe1-0dbd1e05889b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:25:25.048 143780 DEBUG oslo_concurrency.lockutils [req-019620be-87b6-453d-9856-0124b703e5a5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:25:25.049 143780 DEBUG oslo_concurrency.lockutils [req-019620be-87b6-453d-9856-0124b703e5a5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:25:32.098 143787 DEBUG oslo_service.periodic_task [req-9df01d79-0cca-4d31-bc30-a0b39a9daa65 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:25:32.101 143787 DEBUG oslo_concurrency.lockutils [req-5545c96a-9e91-4224-a1b0-58fec07d0a95 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:25:32.102 143787 DEBUG oslo_concurrency.lockutils [req-5545c96a-9e91-4224-a1b0-58fec07d0a95 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:25:37.177 143781 DEBUG oslo_service.periodic_task [req-f626de29-f258-461b-9bb8-de65358e875a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:25:37.181 143781 DEBUG oslo_concurrency.lockutils [req-29da3c4d-53d1-4b9e-9a96-490de5ab4576 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:25:37.181 143781 DEBUG oslo_concurrency.lockutils [req-29da3c4d-53d1-4b9e-9a96-490de5ab4576 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:25:51.810 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 0cb9abc65c33452cb3edf863dc84d95e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:25:51.810 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 0cb9abc65c33452cb3edf863dc84d95e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:25:51.810 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 0cb9abc65c33452cb3edf863dc84d95e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:25:51.810 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 0cb9abc65c33452cb3edf863dc84d95e __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:25:51.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:25:51.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:25:51.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:25:51.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 0cb9abc65c33452cb3edf863dc84d95e poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:25:51.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 0cb9abc65c33452cb3edf863dc84d95e poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:25:51.811 143779 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:25:51.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 0cb9abc65c33452cb3edf863dc84d95e poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:25:51.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:25:51.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 0cb9abc65c33452cb3edf863dc84d95e poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:25:51.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:25:51.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:25:51.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:25:51.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:25:51.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:25:51.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:25:51.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:25:51.812 143780 DEBUG oslo_concurrency.lockutils [req-b0e84c93-25fa-4d35-b124-f2b40e3fbe9e - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:25:51.812 143781 DEBUG oslo_concurrency.lockutils [req-b0e84c93-25fa-4d35-b124-f2b40e3fbe9e - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:25:51.812 143787 DEBUG oslo_concurrency.lockutils [req-b0e84c93-25fa-4d35-b124-f2b40e3fbe9e - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:25:51.812 143779 DEBUG oslo_concurrency.lockutils [req-b0e84c93-25fa-4d35-b124-f2b40e3fbe9e - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:25:51.812 143780 DEBUG nova.scheduler.host_manager [req-b0e84c93-25fa-4d35-b124-f2b40e3fbe9e - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:25:51.812 143787 DEBUG nova.scheduler.host_manager [req-b0e84c93-25fa-4d35-b124-f2b40e3fbe9e - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:25:51.812 143781 DEBUG nova.scheduler.host_manager [req-b0e84c93-25fa-4d35-b124-f2b40e3fbe9e - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:25:51.812 143779 DEBUG nova.scheduler.host_manager [req-b0e84c93-25fa-4d35-b124-f2b40e3fbe9e - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:25:51.812 143787 DEBUG oslo_concurrency.lockutils [req-b0e84c93-25fa-4d35-b124-f2b40e3fbe9e - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:25:51.812 143781 DEBUG oslo_concurrency.lockutils [req-b0e84c93-25fa-4d35-b124-f2b40e3fbe9e - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:25:51.812 143780 DEBUG oslo_concurrency.lockutils [req-b0e84c93-25fa-4d35-b124-f2b40e3fbe9e - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:25:51.812 143779 DEBUG oslo_concurrency.lockutils [req-b0e84c93-25fa-4d35-b124-f2b40e3fbe9e - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:25:51.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:25:51.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:25:51.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:25:51.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:25:51.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:25:51.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:25:51.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:25:51.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:25:51.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:25:51.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:25:51.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:25:51.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:25:52.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:25:52.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:25:52.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:25:52.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:25:52.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:25:52.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:25:52.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:25:52.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:25:52.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:25:52.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:25:52.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:25:52.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:25:53.112 143779 DEBUG oslo_service.periodic_task [req-c2958f49-02a9-4aac-8c5b-43bf7cf67a32 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:25:53.115 143779 DEBUG oslo_concurrency.lockutils [req-7ca70980-1d3b-43e4-ac7d-4ac1928714c7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:25:53.116 143779 DEBUG oslo_concurrency.lockutils [req-7ca70980-1d3b-43e4-ac7d-4ac1928714c7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:25:54.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:25:54.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:25:54.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:25:54.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:25:54.817 143779 DEBUG oslo_messaging._drivers.amqpdriver 
[-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:25:54.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:25:54.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:25:54.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:25:54.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:25:54.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:25:54.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:25:54.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:25:55.053 143780 DEBUG oslo_service.periodic_task [req-019620be-87b6-453d-9856-0124b703e5a5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:25:55.057 143780 DEBUG oslo_concurrency.lockutils [req-2fc82226-fe4a-4e33-8b3f-7adf149b7b65 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 
01:25:55.057 143780 DEBUG oslo_concurrency.lockutils [req-2fc82226-fe4a-4e33-8b3f-7adf149b7b65 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:25:58.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:25:58.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:25:58.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:25:58.821 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:25:58.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:25:58.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:25:58.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:25:58.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:25:58.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:25:58.822 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:25:58.823 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:25:58.823 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:26:02.110 143787 DEBUG oslo_service.periodic_task [req-5545c96a-9e91-4224-a1b0-58fec07d0a95 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:26:02.113 143787 DEBUG oslo_concurrency.lockutils [req-29d20da5-48ee-4527-b54d-5dccd463e3a1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:26:02.114 143787 DEBUG oslo_concurrency.lockutils [req-29d20da5-48ee-4527-b54d-5dccd463e3a1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:26:06.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:26:06.821 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:26:06.821 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 
2026-04-02 01:26:06.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:26:06.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:26:06.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:26:06.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:26:06.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:26:06.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:26:06.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:26:06.826 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:26:06.826 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:26:07.185 143781 DEBUG oslo_service.periodic_task [req-29da3c4d-53d1-4b9e-9a96-490de5ab4576 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:26:07.189 143781 DEBUG oslo_concurrency.lockutils 
[req-7c7af3eb-f28f-4f00-8ca9-48662d60c9ce - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:26:07.189 143781 DEBUG oslo_concurrency.lockutils [req-7c7af3eb-f28f-4f00-8ca9-48662d60c9ce - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:26:22.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:26:22.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:26:22.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:26:22.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:26:22.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:26:22.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:26:22.826 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:26:22.826 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:26:22.826 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:26:22.827 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:26:22.827 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:26:22.827 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:26:23.121 143779 DEBUG oslo_service.periodic_task [req-7ca70980-1d3b-43e4-ac7d-4ac1928714c7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:26:23.126 143779 DEBUG oslo_concurrency.lockutils [req-29d7b133-7ef1-4983-a15e-068999835c36 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:26:23.126 143779 DEBUG oslo_concurrency.lockutils [req-29d7b133-7ef1-4983-a15e-068999835c36 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:26:25.062 143780 DEBUG oslo_service.periodic_task [req-2fc82226-fe4a-4e33-8b3f-7adf149b7b65 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:26:25.066 143780 DEBUG oslo_concurrency.lockutils [req-cac8f030-5abe-4eb8-a26c-0a9ce4a91686 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:26:25.066 143780 DEBUG oslo_concurrency.lockutils [req-cac8f030-5abe-4eb8-a26c-0a9ce4a91686 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:26:32.124 143787 DEBUG oslo_service.periodic_task [req-29d20da5-48ee-4527-b54d-5dccd463e3a1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:26:32.128 143787 DEBUG oslo_concurrency.lockutils [req-8640cb30-583a-428d-bb5f-79704408f71c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:26:32.129 143787 DEBUG oslo_concurrency.lockutils [req-8640cb30-583a-428d-bb5f-79704408f71c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:26:37.195 143781 DEBUG oslo_service.periodic_task [req-7c7af3eb-f28f-4f00-8ca9-48662d60c9ce - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:26:37.199 143781 DEBUG oslo_concurrency.lockutils [req-269c210f-69cf-4ca3-bfe8-7042e99587cf - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:26:37.199 143781 DEBUG oslo_concurrency.lockutils [req-269c210f-69cf-4ca3-bfe8-7042e99587cf - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:26:53.131 143779 DEBUG oslo_service.periodic_task [req-29d7b133-7ef1-4983-a15e-068999835c36 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:26:53.135 143779 DEBUG oslo_concurrency.lockutils [req-3e729ace-8f8d-41c6-bcdd-01810d3c99d6 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:26:53.136 143779 DEBUG oslo_concurrency.lockutils [req-3e729ace-8f8d-41c6-bcdd-01810d3c99d6 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:26:54.824 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:26:54.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:26:54.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:26:54.828 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:26:54.828 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:26:54.828 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:26:54.829 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:26:54.829 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:26:54.830 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:26:54.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:26:54.830 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:26:54.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:26:55.072 143780 DEBUG oslo_service.periodic_task [req-cac8f030-5abe-4eb8-a26c-0a9ce4a91686 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:26:55.076 143780 DEBUG oslo_concurrency.lockutils [req-34ed39aa-c90e-4e8e-b98c-a0cc14b882d2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:26:55.076 143780 DEBUG oslo_concurrency.lockutils [req-34ed39aa-c90e-4e8e-b98c-a0cc14b882d2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:27:03.084 143787 DEBUG oslo_service.periodic_task [req-8640cb30-583a-428d-bb5f-79704408f71c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:27:03.095 143787 DEBUG oslo_concurrency.lockutils [req-9b6251fc-4342-44f9-ae3e-aea3034ca6bc - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:27:03.096 143787 DEBUG oslo_concurrency.lockutils [req-9b6251fc-4342-44f9-ae3e-aea3034ca6bc - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:27:07.205 143781 DEBUG oslo_service.periodic_task [req-269c210f-69cf-4ca3-bfe8-7042e99587cf - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:27:07.209 143781 DEBUG oslo_concurrency.lockutils [req-2916dda5-a7bd-45a0-bb03-c715b50d76b9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:27:07.210 143781 DEBUG oslo_concurrency.lockutils [req-2916dda5-a7bd-45a0-bb03-c715b50d76b9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:27:23.140 143779 DEBUG oslo_service.periodic_task [req-3e729ace-8f8d-41c6-bcdd-01810d3c99d6 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:27:23.144 143779 DEBUG oslo_concurrency.lockutils [req-b99c4541-500a-4f9c-916b-24bc303fbc93 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:27:23.145 143779 DEBUG oslo_concurrency.lockutils [req-b99c4541-500a-4f9c-916b-24bc303fbc93 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:27:26.044 143780 DEBUG oslo_service.periodic_task [req-34ed39aa-c90e-4e8e-b98c-a0cc14b882d2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:27:26.048 143780 DEBUG oslo_concurrency.lockutils [req-a0e2cd67-80be-4d61-a0ba-cae737251345 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:27:26.048 143780 DEBUG oslo_concurrency.lockutils [req-a0e2cd67-80be-4d61-a0ba-cae737251345 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:27:34.083 143787 DEBUG oslo_service.periodic_task [req-9b6251fc-4342-44f9-ae3e-aea3034ca6bc - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:27:34.087 143787 DEBUG oslo_concurrency.lockutils [req-7e62d7e5-3150-4a32-82eb-95d3646b9708 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:27:34.087 143787 DEBUG oslo_concurrency.lockutils [req-7e62d7e5-3150-4a32-82eb-95d3646b9708 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:27:37.215 143781 DEBUG oslo_service.periodic_task [req-2916dda5-a7bd-45a0-bb03-c715b50d76b9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:27:37.220 143781 DEBUG oslo_concurrency.lockutils [req-b4f2c8cd-1a98-4d70-9c5e-2f413d097d53 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:27:37.220 143781 DEBUG oslo_concurrency.lockutils [req-b4f2c8cd-1a98-4d70-9c5e-2f413d097d53 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:27:52.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 539548fa33b744b0b1dbe48ecb43a5d1 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:27:52.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 539548fa33b744b0b1dbe48ecb43a5d1 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:27:52.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 539548fa33b744b0b1dbe48ecb43a5d1 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:27:52.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 539548fa33b744b0b1dbe48ecb43a5d1 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:27:52.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:27:52.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:27:52.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:27:52.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 539548fa33b744b0b1dbe48ecb43a5d1 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:27:52.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 539548fa33b744b0b1dbe48ecb43a5d1 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:27:52.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 539548fa33b744b0b1dbe48ecb43a5d1 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:27:52.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:27:52.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:27:52.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:27:52.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:27:52.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 539548fa33b744b0b1dbe48ecb43a5d1 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:27:52.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:27:52.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:27:52.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:27:52.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:27:52.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:27:52.814 143780 DEBUG oslo_concurrency.lockutils [req-655e9a65-f0f5-43ca-b97b-6d411a2cf7ef - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:27:52.814 143787 DEBUG oslo_concurrency.lockutils [req-655e9a65-f0f5-43ca-b97b-6d411a2cf7ef - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:27:52.814 143781 DEBUG oslo_concurrency.lockutils [req-655e9a65-f0f5-43ca-b97b-6d411a2cf7ef - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:27:52.814 143780 DEBUG nova.scheduler.host_manager [req-655e9a65-f0f5-43ca-b97b-6d411a2cf7ef - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:27:52.814 143787 DEBUG nova.scheduler.host_manager [req-655e9a65-f0f5-43ca-b97b-6d411a2cf7ef - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:27:52.814 143781 DEBUG nova.scheduler.host_manager [req-655e9a65-f0f5-43ca-b97b-6d411a2cf7ef - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:27:52.815 143780 DEBUG oslo_concurrency.lockutils [req-655e9a65-f0f5-43ca-b97b-6d411a2cf7ef - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:27:52.815 143787 DEBUG oslo_concurrency.lockutils [req-655e9a65-f0f5-43ca-b97b-6d411a2cf7ef - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:27:52.815 143781 DEBUG oslo_concurrency.lockutils [req-655e9a65-f0f5-43ca-b97b-6d411a2cf7ef - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:27:52.815 143779 DEBUG oslo_concurrency.lockutils [req-655e9a65-f0f5-43ca-b97b-6d411a2cf7ef - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:27:52.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:27:52.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:27:52.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:27:52.815 143779 DEBUG nova.scheduler.host_manager [req-655e9a65-f0f5-43ca-b97b-6d411a2cf7ef - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:27:52.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:27:52.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:27:52.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:27:52.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:27:52.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:27:52.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:27:52.815 143779 DEBUG oslo_concurrency.lockutils [req-655e9a65-f0f5-43ca-b97b-6d411a2cf7ef - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:27:52.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:27:52.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:27:52.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:27:53.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:27:53.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:27:53.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:27:53.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:27:53.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:27:53.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:27:53.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:27:53.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:27:53.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:27:53.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:27:53.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:27:53.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:27:54.057 143779 DEBUG oslo_service.periodic_task [req-b99c4541-500a-4f9c-916b-24bc303fbc93 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:27:54.061 143779 DEBUG oslo_concurrency.lockutils [req-9b54dcb0-c6f1-4a56-a199-1f5f77973142 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:27:54.062 143779 DEBUG oslo_concurrency.lockutils [req-9b54dcb0-c6f1-4a56-a199-1f5f77973142 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:27:55.819 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:27:55.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:27:55.819 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:27:55.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:27:55.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:27:55.819 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:27:55.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:27:55.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:27:55.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:27:55.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:27:55.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:27:55.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:27:56.053 143780 DEBUG oslo_service.periodic_task [req-a0e2cd67-80be-4d61-a0ba-cae737251345 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:27:56.058 143780 DEBUG oslo_concurrency.lockutils [req-1b6699c0-0126-49fa-978f-8d21f3f3a9ce - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:27:56.059 143780 DEBUG oslo_concurrency.lockutils [req-1b6699c0-0126-49fa-978f-8d21f3f3a9ce - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:27:59.821 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:27:59.821 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:27:59.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:27:59.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:27:59.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:27:59.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:27:59.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:27:59.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:27:59.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:27:59.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:27:59.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:27:59.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:28:05.083 143787 DEBUG oslo_service.periodic_task [req-7e62d7e5-3150-4a32-82eb-95d3646b9708 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:28:05.087 143787 DEBUG oslo_concurrency.lockutils [req-46eb0f5f-42d2-468d-9b3f-94a9dda9f742 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:28:05.087 143787 DEBUG oslo_concurrency.lockutils [req-46eb0f5f-42d2-468d-9b3f-94a9dda9f742 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:28:07.226 143781 DEBUG oslo_service.periodic_task [req-b4f2c8cd-1a98-4d70-9c5e-2f413d097d53 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:28:07.230 143781 DEBUG oslo_concurrency.lockutils [req-e03a3284-6a93-4c80-ba70-fbc7bc9ddb79 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:28:07.230 143781 DEBUG oslo_concurrency.lockutils [req-e03a3284-6a93-4c80-ba70-fbc7bc9ddb79 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:28:07.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:28:07.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:28:07.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:28:07.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:28:07.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:28:07.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:28:07.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:28:07.826 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:28:07.826 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:28:07.827 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:28:07.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:28:07.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:28:23.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:28:23.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:28:23.827 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:28:23.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:28:23.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:28:23.827 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:28:23.827 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:28:23.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:28:23.828 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:28:23.830 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:28:23.830 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:28:23.831 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:28:24.067 143779 DEBUG oslo_service.periodic_task [req-9b54dcb0-c6f1-4a56-a199-1f5f77973142 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:28:24.071 143779 DEBUG oslo_concurrency.lockutils [req-140ef4fe-7f72-4c7f-90f6-693999292f6d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:28:24.071 143779 DEBUG oslo_concurrency.lockutils [req-140ef4fe-7f72-4c7f-90f6-693999292f6d - - - - -] Lock
"93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:28:27.045 143780 DEBUG oslo_service.periodic_task [req-1b6699c0-0126-49fa-978f-8d21f3f3a9ce - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:28:27.052 143780 DEBUG oslo_concurrency.lockutils [req-5427a459-555b-4282-8e02-9995b963eb35 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:28:27.053 143780 DEBUG oslo_concurrency.lockutils [req-5427a459-555b-4282-8e02-9995b963eb35 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:28:36.083 143787 DEBUG oslo_service.periodic_task [req-46eb0f5f-42d2-468d-9b3f-94a9dda9f742 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:28:36.087 143787 DEBUG oslo_concurrency.lockutils [req-c38d0aed-a1c1-460f-89b4-ca01b6bb09ec - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:28:36.087 143787 DEBUG oslo_concurrency.lockutils [req-c38d0aed-a1c1-460f-89b4-ca01b6bb09ec - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:28:38.041 143781 DEBUG oslo_service.periodic_task [req-e03a3284-6a93-4c80-ba70-fbc7bc9ddb79 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:28:38.047 143781 DEBUG oslo_concurrency.lockutils [req-1e93040c-a6d8-4831-8b0f-ad2b576cd9cc - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:28:38.047 143781 DEBUG oslo_concurrency.lockutils [req-1e93040c-a6d8-4831-8b0f-ad2b576cd9cc - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:28:55.058 143779 DEBUG oslo_service.periodic_task [req-140ef4fe-7f72-4c7f-90f6-693999292f6d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:28:55.062 143779 DEBUG oslo_concurrency.lockutils [req-254c5509-0c97-44d8-9692-c635aa4c7f40 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:28:55.063 143779 DEBUG oslo_concurrency.lockutils [req-254c5509-0c97-44d8-9692-c635aa4c7f40 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:28:55.829 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:28:55.829 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:28:55.829 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:28:55.829 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:28:55.829 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:28:55.830 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:28:55.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:28:55.830 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:28:55.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:28:55.832 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:28:55.832 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:28:55.832 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:28:57.058 143780 DEBUG oslo_service.periodic_task [req-5427a459-555b-4282-8e02-9995b963eb35 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:28:57.063 143780 DEBUG oslo_concurrency.lockutils [req-3cfe058d-ab8b-4c06-aa7c-fc186f78e3bd - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:28:57.063 143780 DEBUG oslo_concurrency.lockutils [req-3cfe058d-ab8b-4c06-aa7c-fc186f78e3bd - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:29:07.082 143787 DEBUG oslo_service.periodic_task [req-c38d0aed-a1c1-460f-89b4-ca01b6bb09ec - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:29:07.087 143787 DEBUG oslo_concurrency.lockutils [req-fe5625ed-9246-4638-a842-c44b6c5f53c9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:29:07.087 143787 DEBUG oslo_concurrency.lockutils [req-fe5625ed-9246-4638-a842-c44b6c5f53c9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:29:08.054 143781 DEBUG oslo_service.periodic_task [req-1e93040c-a6d8-4831-8b0f-ad2b576cd9cc - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:29:08.058 143781 DEBUG oslo_concurrency.lockutils [req-a62350dd-5883-450c-aa23-7f1bddf8ee81 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:29:08.059 143781 DEBUG oslo_concurrency.lockutils [req-a62350dd-5883-450c-aa23-7f1bddf8ee81 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:29:26.057 143779 DEBUG oslo_service.periodic_task [req-254c5509-0c97-44d8-9692-c635aa4c7f40 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:29:26.061 143779 DEBUG oslo_concurrency.lockutils [req-05035298-8230-4d0e-bf6f-09bdf6dd0517 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:29:26.062 143779 DEBUG oslo_concurrency.lockutils [req-05035298-8230-4d0e-bf6f-09bdf6dd0517 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:29:28.044 143780 DEBUG oslo_service.periodic_task [req-3cfe058d-ab8b-4c06-aa7c-fc186f78e3bd - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:29:28.048 143780 DEBUG oslo_concurrency.lockutils [req-64733c9e-8878-48c2-a99f-bb052cfc32b2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:29:28.049 143780 DEBUG oslo_concurrency.lockutils [req-64733c9e-8878-48c2-a99f-bb052cfc32b2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:29:37.092 143787 DEBUG oslo_service.periodic_task [req-fe5625ed-9246-4638-a842-c44b6c5f53c9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:29:37.097 143787 DEBUG oslo_concurrency.lockutils [req-513ae8c6-dc68-405c-ab3a-aece89a71f23 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:29:37.097 143787 DEBUG oslo_concurrency.lockutils [req-513ae8c6-dc68-405c-ab3a-aece89a71f23 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:29:38.066 143781 DEBUG 
oslo_service.periodic_task [req-a62350dd-5883-450c-aa23-7f1bddf8ee81 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:29:38.070 143781 DEBUG oslo_concurrency.lockutils [req-5f419996-2027-45a3-8ab1-f8bdeb7e5e8d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:29:38.070 143781 DEBUG oslo_concurrency.lockutils [req-5f419996-2027-45a3-8ab1-f8bdeb7e5e8d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:29:54.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: a80d2a5bcc6b478c80c6fc0850c8c203 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:29:54.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: a80d2a5bcc6b478c80c6fc0850c8c203 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:29:54.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: a80d2a5bcc6b478c80c6fc0850c8c203 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:29:54.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: a80d2a5bcc6b478c80c6fc0850c8c203 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:29:54.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:29:54.813 
143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:29:54.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:29:54.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: a80d2a5bcc6b478c80c6fc0850c8c203 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:29:54.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: a80d2a5bcc6b478c80c6fc0850c8c203 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:29:54.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: a80d2a5bcc6b478c80c6fc0850c8c203 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:29:54.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:29:54.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:29:54.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: a80d2a5bcc6b478c80c6fc0850c8c203 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:29:54.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:29:54.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:29:54.814 143779 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:29:54.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:29:54.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:29:54.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:29:54.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:29:54.814 143787 DEBUG oslo_concurrency.lockutils [req-7e4c7ad8-db38-4e30-8958-455d1626edc9 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:29:54.814 143781 DEBUG oslo_concurrency.lockutils [req-7e4c7ad8-db38-4e30-8958-455d1626edc9 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:29:54.814 143779 DEBUG oslo_concurrency.lockutils [req-7e4c7ad8-db38-4e30-8958-455d1626edc9 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:29:54.814 143787 DEBUG nova.scheduler.host_manager [req-7e4c7ad8-db38-4e30-8958-455d1626edc9 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:29:54.814 143781 DEBUG nova.scheduler.host_manager [req-7e4c7ad8-db38-4e30-8958-455d1626edc9 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:29:54.814 143779 DEBUG nova.scheduler.host_manager [req-7e4c7ad8-db38-4e30-8958-455d1626edc9 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:29:54.815 143787 DEBUG oslo_concurrency.lockutils [req-7e4c7ad8-db38-4e30-8958-455d1626edc9 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:29:54.815 143779 DEBUG oslo_concurrency.lockutils [req-7e4c7ad8-db38-4e30-8958-455d1626edc9 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:29:54.815 143781 DEBUG oslo_concurrency.lockutils [req-7e4c7ad8-db38-4e30-8958-455d1626edc9 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:29:54.815 143780 DEBUG oslo_concurrency.lockutils [req-7e4c7ad8-db38-4e30-8958-455d1626edc9 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:29:54.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:29:54.815 143780 DEBUG nova.scheduler.host_manager [req-7e4c7ad8-db38-4e30-8958-455d1626edc9 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:29:54.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:29:54.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:29:54.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:29:54.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:29:54.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:29:54.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:29:54.815 143780 DEBUG oslo_concurrency.lockutils [req-7e4c7ad8-db38-4e30-8958-455d1626edc9 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:29:54.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:29:54.815 143781 DEBUG oslo_messaging._drivers.amqpdriver 
[-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:29:54.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:29:54.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:29:54.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:29:55.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:29:55.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:29:55.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:29:55.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:29:55.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:29:55.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:29:55.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:29:55.817 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:29:55.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:29:55.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:29:55.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:29:55.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:29:56.067 143779 DEBUG oslo_service.periodic_task [req-05035298-8230-4d0e-bf6f-09bdf6dd0517 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:29:56.071 143779 DEBUG oslo_concurrency.lockutils [req-7dde8262-d268-4bf9-a6f7-70dd4999cd34 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:29:56.072 143779 DEBUG oslo_concurrency.lockutils [req-7dde8262-d268-4bf9-a6f7-70dd4999cd34 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:29:57.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 
2026-04-02 01:29:57.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:29:57.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:29:57.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:29:57.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:29:57.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:29:57.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:29:57.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:29:57.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:29:57.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:29:57.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:29:57.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:29:58.053 143780 DEBUG oslo_service.periodic_task [req-64733c9e-8878-48c2-a99f-bb052cfc32b2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:29:58.058 143780 DEBUG oslo_concurrency.lockutils [req-3eb5bd0e-7d06-45e4-8619-f9ce1d84a67e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:29:58.058 143780 DEBUG oslo_concurrency.lockutils [req-3eb5bd0e-7d06-45e4-8619-f9ce1d84a67e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:30:01.821 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:30:01.821 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:30:01.821 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:30:01.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:30:01.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:30:01.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:30:01.823 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:30:01.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:30:01.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:30:01.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:30:01.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:30:01.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:30:07.103 143787 DEBUG oslo_service.periodic_task [req-513ae8c6-dc68-405c-ab3a-aece89a71f23 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:30:07.108 143787 DEBUG oslo_concurrency.lockutils [req-6f5d5e8d-0fd3-4783-84fe-07f000fa169a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:30:07.108 143787 DEBUG oslo_concurrency.lockutils [req-6f5d5e8d-0fd3-4783-84fe-07f000fa169a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:30:09.041 143781 DEBUG oslo_service.periodic_task [req-5f419996-2027-45a3-8ab1-f8bdeb7e5e8d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:30:09.045 143781 DEBUG oslo_concurrency.lockutils [req-8b337921-b79d-475e-ab0d-3b65a8dc11fe - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:30:09.045 143781 DEBUG oslo_concurrency.lockutils [req-8b337921-b79d-475e-ab0d-3b65a8dc11fe - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:30:09.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:30:09.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:30:09.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:30:09.827 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:30:09.827 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:30:09.827 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:30:09.828 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:30:09.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:30:09.828 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:30:09.828 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:30:09.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:30:09.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:30:25.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:30:25.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:30:25.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:30:25.829 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:30:25.829 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:30:25.829 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:30:25.829 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:30:25.830 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:30:25.830 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:30:25.830 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:30:25.830 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:30:25.830 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:30:26.076 143779 DEBUG oslo_service.periodic_task [req-7dde8262-d268-4bf9-a6f7-70dd4999cd34 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:30:26.080 143779 DEBUG oslo_concurrency.lockutils [req-246384e6-4352-4389-bbdc-0332e1c911ed - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:30:26.081 143779 DEBUG oslo_concurrency.lockutils [req-246384e6-4352-4389-bbdc-0332e1c911ed - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:30:28.064 143780 DEBUG oslo_service.periodic_task [req-3eb5bd0e-7d06-45e4-8619-f9ce1d84a67e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:30:28.068 143780 DEBUG oslo_concurrency.lockutils [req-95672229-f14d-435f-8b88-01a5383a95af - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:30:28.068 143780 DEBUG oslo_concurrency.lockutils [req-95672229-f14d-435f-8b88-01a5383a95af - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:30:38.083 143787 DEBUG oslo_service.periodic_task [req-6f5d5e8d-0fd3-4783-84fe-07f000fa169a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:30:38.091 143787 DEBUG oslo_concurrency.lockutils [req-bf9ef6b9-bdcd-408a-90e8-f894f1732195 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:30:38.091 143787 DEBUG oslo_concurrency.lockutils [req-bf9ef6b9-bdcd-408a-90e8-f894f1732195 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:30:39.051 143781 DEBUG oslo_service.periodic_task [req-8b337921-b79d-475e-ab0d-3b65a8dc11fe - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:30:39.055 143781 DEBUG oslo_concurrency.lockutils [req-8df6d6ab-6575-4944-89f6-9dea2cf61aa5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:30:39.055 143781 DEBUG oslo_concurrency.lockutils [req-8df6d6ab-6575-4944-89f6-9dea2cf61aa5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:30:57.057 143779 DEBUG oslo_service.periodic_task [req-246384e6-4352-4389-bbdc-0332e1c911ed - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:30:57.061 143779 DEBUG oslo_concurrency.lockutils [req-02c0b97a-c19c-4500-9f6e-7af9a9de419f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:30:57.062 143779 DEBUG oslo_concurrency.lockutils [req-02c0b97a-c19c-4500-9f6e-7af9a9de419f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:30:57.826 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:30:57.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:30:57.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:30:57.832 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:30:57.832 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:30:57.832 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:30:57.832 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:30:57.833 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:30:57.833 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:30:57.832 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:30:57.833 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:30:57.833 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:30:58.073 143780 DEBUG oslo_service.periodic_task [req-95672229-f14d-435f-8b88-01a5383a95af - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:30:58.077 143780 DEBUG oslo_concurrency.lockutils [req-ae258577-fc7c-4779-84b6-7827ca3682a2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:30:58.077 143780 DEBUG oslo_concurrency.lockutils [req-ae258577-fc7c-4779-84b6-7827ca3682a2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:31:08.099 143787 DEBUG oslo_service.periodic_task [req-bf9ef6b9-bdcd-408a-90e8-f894f1732195 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:31:08.104 143787 DEBUG oslo_concurrency.lockutils [req-90735d55-0302-4ad1-9939-9e48b2f87e31 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:31:08.104 143787 DEBUG oslo_concurrency.lockutils [req-90735d55-0302-4ad1-9939-9e48b2f87e31 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:31:09.060 143781 DEBUG oslo_service.periodic_task [req-8df6d6ab-6575-4944-89f6-9dea2cf61aa5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:31:09.064 143781 DEBUG oslo_concurrency.lockutils [req-c7d54421-c7b4-41ce-9cab-9554e6f5d69e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:31:09.064 143781 DEBUG oslo_concurrency.lockutils [req-c7d54421-c7b4-41ce-9cab-9554e6f5d69e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:31:28.057 143779 DEBUG oslo_service.periodic_task [req-02c0b97a-c19c-4500-9f6e-7af9a9de419f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:31:28.062 143779 DEBUG oslo_concurrency.lockutils [req-26103899-a4fe-4599-92ea-12772b2297ba - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:31:28.062 143779 DEBUG oslo_concurrency.lockutils [req-26103899-a4fe-4599-92ea-12772b2297ba - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:31:29.044 143780 DEBUG oslo_service.periodic_task [req-ae258577-fc7c-4779-84b6-7827ca3682a2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:31:29.048 143780 DEBUG oslo_concurrency.lockutils [req-63815673-5c7d-4f3b-a00c-80c805841e02 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:31:29.048 143780 DEBUG oslo_concurrency.lockutils [req-63815673-5c7d-4f3b-a00c-80c805841e02 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:31:38.112 143787 DEBUG oslo_service.periodic_task [req-90735d55-0302-4ad1-9939-9e48b2f87e31 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:31:38.116 143787 DEBUG oslo_concurrency.lockutils [req-3202156f-0a1e-4a81-b406-89305392957a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:31:38.116 143787 DEBUG oslo_concurrency.lockutils [req-3202156f-0a1e-4a81-b406-89305392957a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:31:39.069 143781 DEBUG oslo_service.periodic_task [req-c7d54421-c7b4-41ce-9cab-9554e6f5d69e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:31:39.073 143781 DEBUG oslo_concurrency.lockutils [req-687d5e6e-c240-4825-945a-f93b486e4508 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:31:39.073 143781 DEBUG oslo_concurrency.lockutils [req-687d5e6e-c240-4825-945a-f93b486e4508 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:31:54.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d8df83c2ec4a4559bbb0fa8ce87dec69 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:31:54.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d8df83c2ec4a4559bbb0fa8ce87dec69 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:31:54.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d8df83c2ec4a4559bbb0fa8ce87dec69 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:31:54.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d8df83c2ec4a4559bbb0fa8ce87dec69 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:31:54.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:31:54.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:31:54.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d8df83c2ec4a4559bbb0fa8ce87dec69 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:31:54.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:31:54.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d8df83c2ec4a4559bbb0fa8ce87dec69 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:31:54.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:31:54.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d8df83c2ec4a4559bbb0fa8ce87dec69 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:31:54.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d8df83c2ec4a4559bbb0fa8ce87dec69 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:31:54.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:31:54.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:31:54.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:31:54.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:31:54.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:31:54.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:31:54.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:31:54.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:31:54.815 143779 DEBUG oslo_concurrency.lockutils [req-756e74a7-42ba-4145-a871-b440f2968ee9 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:31:54.815 143781 DEBUG oslo_concurrency.lockutils [req-756e74a7-42ba-4145-a871-b440f2968ee9 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:31:54.815 143779 DEBUG nova.scheduler.host_manager [req-756e74a7-42ba-4145-a871-b440f2968ee9 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:31:54.815 143787 DEBUG oslo_concurrency.lockutils [req-756e74a7-42ba-4145-a871-b440f2968ee9 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:31:54.815 143780 DEBUG oslo_concurrency.lockutils [req-756e74a7-42ba-4145-a871-b440f2968ee9 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:31:54.815 143781 DEBUG nova.scheduler.host_manager [req-756e74a7-42ba-4145-a871-b440f2968ee9 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:31:54.815 143779 DEBUG oslo_concurrency.lockutils [req-756e74a7-42ba-4145-a871-b440f2968ee9 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:31:54.815 143787 DEBUG nova.scheduler.host_manager [req-756e74a7-42ba-4145-a871-b440f2968ee9 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:31:54.815 143780 DEBUG nova.scheduler.host_manager [req-756e74a7-42ba-4145-a871-b440f2968ee9 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:31:54.816 143781 DEBUG oslo_concurrency.lockutils [req-756e74a7-42ba-4145-a871-b440f2968ee9 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:31:54.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:31:54.816 143787 DEBUG oslo_concurrency.lockutils [req-756e74a7-42ba-4145-a871-b440f2968ee9 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:31:54.816 143780 DEBUG oslo_concurrency.lockutils [req-756e74a7-42ba-4145-a871-b440f2968ee9 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:31:54.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:31:54.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:31:54.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:31:54.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:31:54.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:31:54.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:31:54.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:31:54.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:31:54.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:31:54.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:31:54.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:31:55.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:31:55.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:31:55.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:31:55.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:31:55.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:31:55.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:31:55.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:31:55.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:31:55.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:31:55.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:31:55.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:31:55.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:31:57.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:31:57.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:31:57.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:31:57.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:31:57.819 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:31:57.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:31:57.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:31:57.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:31:57.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:31:57.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:31:57.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:31:57.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:31:58.067 143779 DEBUG oslo_service.periodic_task [req-26103899-a4fe-4599-92ea-12772b2297ba - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:31:58.071 143779 DEBUG oslo_concurrency.lockutils [req-4dc30ae9-0c46-4300-abf0-4ee54f195d57 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:31:58.071 143779 DEBUG oslo_concurrency.lockutils [req-4dc30ae9-0c46-4300-abf0-4ee54f195d57 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:31:59.055 143780 DEBUG oslo_service.periodic_task [req-63815673-5c7d-4f3b-a00c-80c805841e02 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:31:59.058 143780 DEBUG oslo_concurrency.lockutils [req-48387660-4f8a-47f0-8427-ead7bac927c8 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:31:59.059 143780 DEBUG oslo_concurrency.lockutils [req-48387660-4f8a-47f0-8427-ead7bac927c8 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:32:01.821 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:32:01.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:32:01.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 
01:32:01.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:32:01.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:32:01.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:32:01.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:32:01.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:32:01.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:32:01.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:32:01.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:32:01.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:32:08.123 143787 DEBUG oslo_service.periodic_task [req-3202156f-0a1e-4a81-b406-89305392957a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:32:08.127 143787 DEBUG oslo_concurrency.lockutils [req-c78cc530-dc26-4d41-acb5-a975b5193cb8 - 
- - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:32:08.128 143787 DEBUG oslo_concurrency.lockutils [req-c78cc530-dc26-4d41-acb5-a975b5193cb8 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:32:09.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:32:09.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:32:09.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:32:09.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:32:09.826 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:32:09.826 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:32:09.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:32:09.828 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:32:09.828 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:32:09.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:32:09.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:32:09.829 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:32:10.042 143781 DEBUG oslo_service.periodic_task [req-687d5e6e-c240-4825-945a-f93b486e4508 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:32:10.045 143781 DEBUG oslo_concurrency.lockutils [req-bddf31a0-3c19-451b-a680-5239e6991d07 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:32:10.046 143781 DEBUG oslo_concurrency.lockutils [req-bddf31a0-3c19-451b-a680-5239e6991d07 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:32:25.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:32:25.825 143781 DEBUG oslo_messaging._drivers.amqpdriver 
[-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:32:25.826 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:32:25.827 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:32:25.827 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:32:25.827 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:32:25.829 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:32:25.829 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:32:25.829 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:32:25.829 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:32:25.830 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:32:25.830 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:32:28.079 143779 DEBUG oslo_service.periodic_task 
[req-4dc30ae9-0c46-4300-abf0-4ee54f195d57 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:32:28.083 143779 DEBUG oslo_concurrency.lockutils [req-4da24b6e-a6ef-4af7-884c-7917572277af - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:32:28.083 143779 DEBUG oslo_concurrency.lockutils [req-4da24b6e-a6ef-4af7-884c-7917572277af - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:32:29.066 143780 DEBUG oslo_service.periodic_task [req-48387660-4f8a-47f0-8427-ead7bac927c8 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:32:29.069 143780 DEBUG oslo_concurrency.lockutils [req-74071084-cf54-48e6-be37-594a2306cc4a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:32:29.070 143780 DEBUG oslo_concurrency.lockutils [req-74071084-cf54-48e6-be37-594a2306cc4a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:32:38.136 143787 DEBUG oslo_service.periodic_task [req-c78cc530-dc26-4d41-acb5-a975b5193cb8 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells 
run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:32:38.140 143787 DEBUG oslo_concurrency.lockutils [req-097fc8ae-c255-4072-9370-f58bc60dce50 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:32:38.141 143787 DEBUG oslo_concurrency.lockutils [req-097fc8ae-c255-4072-9370-f58bc60dce50 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:32:40.051 143781 DEBUG oslo_service.periodic_task [req-bddf31a0-3c19-451b-a680-5239e6991d07 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:32:40.055 143781 DEBUG oslo_concurrency.lockutils [req-86bc5037-7d7e-472c-9014-ea4cb213446e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:32:40.056 143781 DEBUG oslo_concurrency.lockutils [req-86bc5037-7d7e-472c-9014-ea4cb213446e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:32:57.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:32:57.828 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:32:57.828 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:32:57.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:32:57.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:32:57.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:32:57.832 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:32:57.833 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:32:57.833 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:32:57.833 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:32:57.833 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:32:57.833 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:32:58.088 143779 DEBUG oslo_service.periodic_task 
[req-4da24b6e-a6ef-4af7-884c-7917572277af - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:32:58.092 143779 DEBUG oslo_concurrency.lockutils [req-b2d280f8-392a-44d0-b57e-8e6fe1346495 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:32:58.093 143779 DEBUG oslo_concurrency.lockutils [req-b2d280f8-392a-44d0-b57e-8e6fe1346495 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:32:59.075 143780 DEBUG oslo_service.periodic_task [req-74071084-cf54-48e6-be37-594a2306cc4a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:32:59.079 143780 DEBUG oslo_concurrency.lockutils [req-11acfce3-6794-4397-9038-3af7eb1e5073 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:32:59.079 143780 DEBUG oslo_concurrency.lockutils [req-11acfce3-6794-4397-9038-3af7eb1e5073 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:33:09.082 143787 DEBUG oslo_service.periodic_task [req-097fc8ae-c255-4072-9370-f58bc60dce50 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells 
run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:33:09.088 143787 DEBUG oslo_concurrency.lockutils [req-5206e83f-7df8-421b-836b-67ac8cee5050 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:33:09.088 143787 DEBUG oslo_concurrency.lockutils [req-5206e83f-7df8-421b-836b-67ac8cee5050 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:33:10.063 143781 DEBUG oslo_service.periodic_task [req-86bc5037-7d7e-472c-9014-ea4cb213446e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:33:10.069 143781 DEBUG oslo_concurrency.lockutils [req-201905b2-19e5-43d1-8a8e-7fc3e3a08d50 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:33:10.070 143781 DEBUG oslo_concurrency.lockutils [req-201905b2-19e5-43d1-8a8e-7fc3e3a08d50 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:33:28.100 143779 DEBUG oslo_service.periodic_task [req-b2d280f8-392a-44d0-b57e-8e6fe1346495 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:33:28.104 143779 DEBUG 
oslo_concurrency.lockutils [req-b4f71aa5-0a7d-419e-a6ca-28f9177bb6e8 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:33:28.105 143779 DEBUG oslo_concurrency.lockutils [req-b4f71aa5-0a7d-419e-a6ca-28f9177bb6e8 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:33:29.088 143780 DEBUG oslo_service.periodic_task [req-11acfce3-6794-4397-9038-3af7eb1e5073 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:33:29.093 143780 DEBUG oslo_concurrency.lockutils [req-7aee641a-a017-4d87-9dbf-0fb30272cfbe - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:33:29.093 143780 DEBUG oslo_concurrency.lockutils [req-7aee641a-a017-4d87-9dbf-0fb30272cfbe - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:33:39.094 143787 DEBUG oslo_service.periodic_task [req-5206e83f-7df8-421b-836b-67ac8cee5050 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:33:39.099 143787 DEBUG oslo_concurrency.lockutils [req-eaa62e3a-fbb5-4efe-86c9-3d9e95e923ba - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" 
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:33:39.099 143787 DEBUG oslo_concurrency.lockutils [req-eaa62e3a-fbb5-4efe-86c9-3d9e95e923ba - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:33:40.077 143781 DEBUG oslo_service.periodic_task [req-201905b2-19e5-43d1-8a8e-7fc3e3a08d50 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:33:40.081 143781 DEBUG oslo_concurrency.lockutils [req-6bebbe83-0225-4cb4-95a1-a1c57b073338 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:33:40.082 143781 DEBUG oslo_concurrency.lockutils [req-6bebbe83-0225-4cb4-95a1-a1c57b073338 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:33:54.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 9b1d693b735a461c8fe1a6ca53ed9e47 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:33:54.812 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 9b1d693b735a461c8fe1a6ca53ed9e47 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:33:54.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with 
unique_id: 9b1d693b735a461c8fe1a6ca53ed9e47 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:33:54.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 9b1d693b735a461c8fe1a6ca53ed9e47 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:33:54.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:33:54.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:33:54.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9b1d693b735a461c8fe1a6ca53ed9e47 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:33:54.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:33:54.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9b1d693b735a461c8fe1a6ca53ed9e47 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:33:54.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9b1d693b735a461c8fe1a6ca53ed9e47 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:33:54.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:33:54.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:33:54.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is 
running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:33:54.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:33:54.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:33:54.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9b1d693b735a461c8fe1a6ca53ed9e47 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:33:54.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:33:54.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:33:54.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:33:54.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:33:54.814 143780 DEBUG oslo_concurrency.lockutils [req-84173d41-b3e0-4ae9-8da6-69bdbb0d28c8 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:33:54.814 143787 DEBUG oslo_concurrency.lockutils [req-84173d41-b3e0-4ae9-8da6-69bdbb0d28c8 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:33:54.814 143781 DEBUG oslo_concurrency.lockutils [req-84173d41-b3e0-4ae9-8da6-69bdbb0d28c8 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:33:54.814 143780 DEBUG nova.scheduler.host_manager [req-84173d41-b3e0-4ae9-8da6-69bdbb0d28c8 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:33:54.814 143787 DEBUG nova.scheduler.host_manager [req-84173d41-b3e0-4ae9-8da6-69bdbb0d28c8 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:33:54.814 143781 DEBUG nova.scheduler.host_manager [req-84173d41-b3e0-4ae9-8da6-69bdbb0d28c8 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:33:54.814 143780 DEBUG oslo_concurrency.lockutils [req-84173d41-b3e0-4ae9-8da6-69bdbb0d28c8 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:33:54.814 143787 DEBUG oslo_concurrency.lockutils [req-84173d41-b3e0-4ae9-8da6-69bdbb0d28c8 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:33:54.814 143781 DEBUG oslo_concurrency.lockutils [req-84173d41-b3e0-4ae9-8da6-69bdbb0d28c8 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:33:54.814 143779 DEBUG oslo_concurrency.lockutils [req-84173d41-b3e0-4ae9-8da6-69bdbb0d28c8 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:33:54.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:33:54.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:33:54.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:33:54.815 143779 DEBUG nova.scheduler.host_manager [req-84173d41-b3e0-4ae9-8da6-69bdbb0d28c8 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:33:54.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:33:54.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:33:54.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:33:54.815 143779 DEBUG oslo_concurrency.lockutils [req-84173d41-b3e0-4ae9-8da6-69bdbb0d28c8 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:33:54.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:33:54.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:33:54.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:33:54.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:33:54.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:33:54.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:33:55.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:33:55.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:33:55.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:33:55.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:33:55.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:33:55.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:33:55.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:33:55.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:33:55.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:33:55.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:33:55.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:33:55.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:33:57.817 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:33:57.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:33:57.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:33:57.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:33:57.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:33:57.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:33:57.819 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:33:57.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:33:57.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:33:57.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:33:57.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:33:57.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:33:58.110 143779 DEBUG oslo_service.periodic_task [req-b4f71aa5-0a7d-419e-a6ca-28f9177bb6e8 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:33:58.114 143779 DEBUG oslo_concurrency.lockutils [req-be5db46b-72e8-4667-95c4-6b6ac653d5b5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:33:58.115 143779 DEBUG oslo_concurrency.lockutils [req-be5db46b-72e8-4667-95c4-6b6ac653d5b5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:33:59.098 143780 DEBUG oslo_service.periodic_task [req-7aee641a-a017-4d87-9dbf-0fb30272cfbe - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:33:59.103 143780 DEBUG oslo_concurrency.lockutils [req-6a57d3f8-36fb-4a8b-a851-b183d3759a2a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:33:59.104 143780 DEBUG oslo_concurrency.lockutils [req-6a57d3f8-36fb-4a8b-a851-b183d3759a2a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:34:01.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:34:01.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:34:01.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:34:01.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:34:01.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:34:01.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:34:01.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:34:01.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:34:01.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:34:01.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:34:01.824 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:34:01.824 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:34:09.824 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:34:09.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:34:09.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:34:09.826 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:34:09.826 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:34:09.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:34:09.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:34:09.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:34:09.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:34:09.827 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:34:09.827 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:34:09.827 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:34:10.082 143787 DEBUG oslo_service.periodic_task [req-eaa62e3a-fbb5-4efe-86c9-3d9e95e923ba - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:34:10.087 143787 DEBUG oslo_concurrency.lockutils [req-fb3a902c-657a-46a0-b03b-53cca0474829 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:34:10.087 143781 DEBUG oslo_service.periodic_task [req-6bebbe83-0225-4cb4-95a1-a1c57b073338 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:34:10.087 143787 DEBUG oslo_concurrency.lockutils [req-fb3a902c-657a-46a0-b03b-53cca0474829 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:34:10.091 143781 DEBUG oslo_concurrency.lockutils [req-ee3b5dbc-ef0a-4d39-b3a4-6101ff8778c7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:34:10.091 143781 DEBUG oslo_concurrency.lockutils [req-ee3b5dbc-ef0a-4d39-b3a4-6101ff8778c7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:34:25.826 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:34:25.827 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:34:25.827 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:34:25.828 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:34:25.828 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:34:25.828 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:34:25.828 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:34:25.829 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:34:25.829 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:34:25.829 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:34:25.829 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:34:25.829 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:34:29.058 143779 DEBUG oslo_service.periodic_task [req-be5db46b-72e8-4667-95c4-6b6ac653d5b5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:34:29.062 143779 DEBUG oslo_concurrency.lockutils [req-211da78c-df1d-40ef-8c92-7964a2117f75 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:34:29.062 143779 DEBUG oslo_concurrency.lockutils [req-211da78c-df1d-40ef-8c92-7964a2117f75 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:34:30.044 143780 DEBUG oslo_service.periodic_task [req-6a57d3f8-36fb-4a8b-a851-b183d3759a2a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:34:30.048 143780 DEBUG oslo_concurrency.lockutils [req-c7c7d62f-ee14-4ec0-a70e-3e4e4525a97b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:34:30.049 143780 DEBUG oslo_concurrency.lockutils [req-c7c7d62f-ee14-4ec0-a70e-3e4e4525a97b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:34:41.041 143781 DEBUG oslo_service.periodic_task [req-ee3b5dbc-ef0a-4d39-b3a4-6101ff8778c7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:34:41.045 143781 DEBUG oslo_concurrency.lockutils [req-be5067fa-e727-4d42-b821-e05dfc435dc1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:34:41.046 143781 DEBUG oslo_concurrency.lockutils [req-be5067fa-e727-4d42-b821-e05dfc435dc1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:34:41.083 143787 DEBUG oslo_service.periodic_task [req-fb3a902c-657a-46a0-b03b-53cca0474829 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:34:41.087 143787 DEBUG oslo_concurrency.lockutils [req-616ab0cd-a30d-4ba1-9112-3046981d5351 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:34:41.087 143787 DEBUG oslo_concurrency.lockutils [req-616ab0cd-a30d-4ba1-9112-3046981d5351 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:34:57.829 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:34:57.829 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:34:57.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:34:57.830 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:34:57.831 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:34:57.831 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:34:57.831 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:34:57.831 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:34:57.831 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:34:57.832 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:34:57.832 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:34:57.832 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:34:59.069 143779 DEBUG oslo_service.periodic_task [req-211da78c-df1d-40ef-8c92-7964a2117f75 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:34:59.073 143779 DEBUG oslo_concurrency.lockutils [req-b7feef76-bd59-4cbb-bee6-14f4fa47a1d3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:34:59.073 143779 DEBUG oslo_concurrency.lockutils [req-b7feef76-bd59-4cbb-bee6-14f4fa47a1d3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:35:00.053 143780 DEBUG oslo_service.periodic_task [req-c7c7d62f-ee14-4ec0-a70e-3e4e4525a97b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:35:00.058 143780 DEBUG oslo_concurrency.lockutils [req-376f8f6c-2b83-4d53-975c-7265fd043094 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:35:00.058 143780 DEBUG oslo_concurrency.lockutils [req-376f8f6c-2b83-4d53-975c-7265fd043094 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:35:12.042 143781 DEBUG oslo_service.periodic_task [req-be5067fa-e727-4d42-b821-e05dfc435dc1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:35:12.046 143781 DEBUG oslo_concurrency.lockutils [req-e4470319-0311-4cf2-8193-4f306dbe56fd - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:35:12.046 143781 DEBUG oslo_concurrency.lockutils [req-e4470319-0311-4cf2-8193-4f306dbe56fd - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:35:12.083 143787 DEBUG oslo_service.periodic_task [req-616ab0cd-a30d-4ba1-9112-3046981d5351 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:35:12.087 143787 DEBUG oslo_concurrency.lockutils [req-c772d72a-8165-4b71-94a0-8963b282d79d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:35:12.088 143787 DEBUG oslo_concurrency.lockutils [req-c772d72a-8165-4b71-94a0-8963b282d79d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:35:29.080 143779 DEBUG oslo_service.periodic_task [req-b7feef76-bd59-4cbb-bee6-14f4fa47a1d3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:35:29.084 143779 DEBUG oslo_concurrency.lockutils [req-59d71cc3-0d41-44d6-a1e5-a469300f54d7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:35:29.084 143779 DEBUG oslo_concurrency.lockutils [req-59d71cc3-0d41-44d6-a1e5-a469300f54d7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:35:30.065 143780 DEBUG oslo_service.periodic_task [req-376f8f6c-2b83-4d53-975c-7265fd043094 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:35:30.070 143780 DEBUG oslo_concurrency.lockutils [req-3adef9a4-08d8-4d4e-8dca-43af33d1eff4 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:35:30.070 143780 DEBUG oslo_concurrency.lockutils [req-3adef9a4-08d8-4d4e-8dca-43af33d1eff4 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:35:42.054 143781 DEBUG oslo_service.periodic_task [req-e4470319-0311-4cf2-8193-4f306dbe56fd - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:35:42.059 143781 DEBUG oslo_concurrency.lockutils [req-15205093-82f2-4e37-9d5f-f73e540129f4 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:35:42.059 143781 DEBUG oslo_concurrency.lockutils [req-15205093-82f2-4e37-9d5f-f73e540129f4 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:35:42.095 143787 DEBUG oslo_service.periodic_task [req-c772d72a-8165-4b71-94a0-8963b282d79d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:35:42.099 143787 DEBUG oslo_concurrency.lockutils [req-d3f8c1ed-8e89-4b0e-becc-737489b01c96 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:35:42.100 143787 DEBUG oslo_concurrency.lockutils [req-d3f8c1ed-8e89-4b0e-becc-737489b01c96 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:35:59.091 143779 DEBUG oslo_service.periodic_task [req-59d71cc3-0d41-44d6-a1e5-a469300f54d7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:35:59.096 143779 DEBUG oslo_concurrency.lockutils [req-36479afb-5474-44f2-9dff-2eaeac18108f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:35:59.096 143779 DEBUG oslo_concurrency.lockutils [req-36479afb-5474-44f2-9dff-2eaeac18108f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:35:59.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 012e594382ac4054b3d0f7dcab26d610 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:35:59.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 012e594382ac4054b3d0f7dcab26d610 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:35:59.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 012e594382ac4054b3d0f7dcab26d610 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:35:59.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 012e594382ac4054b3d0f7dcab26d610 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:35:59.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:35:59.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:35:59.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:35:59.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:35:59.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 012e594382ac4054b3d0f7dcab26d610 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:35:59.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 012e594382ac4054b3d0f7dcab26d610 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:35:59.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 012e594382ac4054b3d0f7dcab26d610 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:35:59.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 012e594382ac4054b3d0f7dcab26d610 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:35:59.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:35:59.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:35:59.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:35:59.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:35:59.814 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:35:59.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:35:59.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:35:59.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:35:59.814 143779 DEBUG oslo_concurrency.lockutils [req-cd1d993e-5858-4f1f-a0b1-fb1099c26e55 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:35:59.814 143787 DEBUG oslo_concurrency.lockutils [req-cd1d993e-5858-4f1f-a0b1-fb1099c26e55 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:35:59.814 143780 DEBUG oslo_concurrency.lockutils [req-cd1d993e-5858-4f1f-a0b1-fb1099c26e55 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:35:59.814 143781 DEBUG oslo_concurrency.lockutils [req-cd1d993e-5858-4f1f-a0b1-fb1099c26e55 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:35:59.814 143779 DEBUG nova.scheduler.host_manager [req-cd1d993e-5858-4f1f-a0b1-fb1099c26e55 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:35:59.814 143787 DEBUG nova.scheduler.host_manager [req-cd1d993e-5858-4f1f-a0b1-fb1099c26e55 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:35:59.815 143780 DEBUG nova.scheduler.host_manager [req-cd1d993e-5858-4f1f-a0b1-fb1099c26e55 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'.
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:35:59.815 143787 DEBUG oslo_concurrency.lockutils [req-cd1d993e-5858-4f1f-a0b1-fb1099c26e55 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:35:59.815 143779 DEBUG oslo_concurrency.lockutils [req-cd1d993e-5858-4f1f-a0b1-fb1099c26e55 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:35:59.815 143781 DEBUG nova.scheduler.host_manager [req-cd1d993e-5858-4f1f-a0b1-fb1099c26e55 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:35:59.815 143780 DEBUG oslo_concurrency.lockutils [req-cd1d993e-5858-4f1f-a0b1-fb1099c26e55 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:35:59.815 143781 DEBUG oslo_concurrency.lockutils [req-cd1d993e-5858-4f1f-a0b1-fb1099c26e55 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:35:59.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:35:59.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:35:59.815 143780 DEBUG oslo_messaging._drivers.amqpdriver 
[-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:35:59.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:35:59.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:35:59.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:35:59.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:35:59.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:35:59.816 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:35:59.816 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:35:59.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:35:59.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:36:00.075 143780 DEBUG oslo_service.periodic_task [req-3adef9a4-08d8-4d4e-8dca-43af33d1eff4 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks 
/usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:36:00.079 143780 DEBUG oslo_concurrency.lockutils [req-248a2ab6-d829-4732-a55d-4678d71b6218 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:36:00.079 143780 DEBUG oslo_concurrency.lockutils [req-248a2ab6-d829-4732-a55d-4678d71b6218 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:36:00.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:36:00.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:36:00.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:36:00.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:36:00.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:36:00.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:36:00.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:36:00.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:36:00.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:36:00.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:36:00.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:36:00.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:36:02.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:36:02.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:36:02.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:36:02.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:36:02.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:36:02.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:36:02.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:36:02.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:36:02.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:36:02.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:36:02.821 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:36:02.821 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:36:06.821 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:36:06.821 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:36:06.821 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:36:06.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:36:06.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:36:06.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:36:06.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:36:06.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:36:06.822 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:36:06.823 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:36:06.823 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:36:06.823 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:36:12.068 143781 DEBUG oslo_service.periodic_task [req-15205093-82f2-4e37-9d5f-f73e540129f4 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:36:12.072 143781 DEBUG oslo_concurrency.lockutils [req-9ffa177e-819d-49df-8f2e-660610926bd6 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:36:12.072 143781 DEBUG 
oslo_concurrency.lockutils [req-9ffa177e-819d-49df-8f2e-660610926bd6 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:36:13.084 143787 DEBUG oslo_service.periodic_task [req-d3f8c1ed-8e89-4b0e-becc-737489b01c96 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:36:13.089 143787 DEBUG oslo_concurrency.lockutils [req-3b559b45-bd59-4e82-a272-ad291946f5b0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:36:13.090 143787 DEBUG oslo_concurrency.lockutils [req-3b559b45-bd59-4e82-a272-ad291946f5b0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:36:14.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:36:14.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:36:14.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:36:14.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:36:14.825 143787 
DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:36:14.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:36:14.826 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:36:14.827 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:36:14.827 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:36:14.829 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:36:14.829 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:36:14.830 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:36:30.057 143779 DEBUG oslo_service.periodic_task [req-36479afb-5474-44f2-9dff-2eaeac18108f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:36:30.061 143779 DEBUG oslo_concurrency.lockutils [req-0a285ad6-c971-48a1-8a87-cf6fc2f62599 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:36:30.062 143779 DEBUG oslo_concurrency.lockutils [req-0a285ad6-c971-48a1-8a87-cf6fc2f62599 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:36:30.826 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:36:30.826 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:36:30.826 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:36:30.827 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:36:30.827 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:36:30.827 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:36:30.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:36:30.829 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:36:30.829 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:36:30.831 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:36:30.831 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:36:30.831 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:36:31.044 143780 DEBUG oslo_service.periodic_task [req-248a2ab6-d829-4732-a55d-4678d71b6218 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:36:31.048 143780 DEBUG oslo_concurrency.lockutils [req-aa3f9352-aef8-4b4f-bf47-2d5ba2b66616 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:36:31.049 143780 DEBUG oslo_concurrency.lockutils [req-aa3f9352-aef8-4b4f-bf47-2d5ba2b66616 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:36:42.078 143781 DEBUG oslo_service.periodic_task [req-9ffa177e-819d-49df-8f2e-660610926bd6 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:36:42.082 143781 DEBUG oslo_concurrency.lockutils [req-6f03410b-bd85-47c1-9395-2b5f001350c2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:36:42.083 143781 DEBUG oslo_concurrency.lockutils [req-6f03410b-bd85-47c1-9395-2b5f001350c2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:36:44.084 143787 DEBUG oslo_service.periodic_task [req-3b559b45-bd59-4e82-a272-ad291946f5b0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:36:44.088 143787 DEBUG oslo_concurrency.lockutils [req-b909c399-7764-42ad-be16-ff68ae0e11d7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:36:44.088 143787 DEBUG oslo_concurrency.lockutils [req-b909c399-7764-42ad-be16-ff68ae0e11d7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:37:01.058 143779 DEBUG oslo_service.periodic_task [req-0a285ad6-c971-48a1-8a87-cf6fc2f62599 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:37:01.058 143780 DEBUG oslo_service.periodic_task [req-aa3f9352-aef8-4b4f-bf47-2d5ba2b66616 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:37:01.062 143779 
DEBUG oslo_concurrency.lockutils [req-f3691cf7-361d-40e2-a5a3-77afaab20ef2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:37:01.062 143780 DEBUG oslo_concurrency.lockutils [req-90bbb8ce-4aa0-4066-a6fc-53d3a9d25a49 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:37:01.062 143780 DEBUG oslo_concurrency.lockutils [req-90bbb8ce-4aa0-4066-a6fc-53d3a9d25a49 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:37:01.062 143779 DEBUG oslo_concurrency.lockutils [req-f3691cf7-361d-40e2-a5a3-77afaab20ef2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:37:02.829 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:37:02.830 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:37:02.830 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:37:02.833 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:37:02.834 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:37:02.834 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:37:02.834 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:37:02.835 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:37:02.835 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:37:02.835 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:37:02.835 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:37:02.835 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:37:12.091 143781 DEBUG oslo_service.periodic_task [req-6f03410b-bd85-47c1-9395-2b5f001350c2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:37:12.095 143781 DEBUG oslo_concurrency.lockutils [req-2f1b26db-557c-4922-97a3-f17fe9d1ae90 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by 
"nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:37:12.095 143781 DEBUG oslo_concurrency.lockutils [req-2f1b26db-557c-4922-97a3-f17fe9d1ae90 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:37:15.083 143787 DEBUG oslo_service.periodic_task [req-b909c399-7764-42ad-be16-ff68ae0e11d7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:37:15.087 143787 DEBUG oslo_concurrency.lockutils [req-fb12227d-ed77-46d3-b2ba-d614e6a463a7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:37:15.088 143787 DEBUG oslo_concurrency.lockutils [req-fb12227d-ed77-46d3-b2ba-d614e6a463a7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:37:32.045 143780 DEBUG oslo_service.periodic_task [req-90bbb8ce-4aa0-4066-a6fc-53d3a9d25a49 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:37:32.049 143780 DEBUG oslo_concurrency.lockutils [req-0fb0c5f3-e775-4c8f-831a-3bded335e8a9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:37:32.050 143780 DEBUG oslo_concurrency.lockutils [req-0fb0c5f3-e775-4c8f-831a-3bded335e8a9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:37:32.057 143779 DEBUG oslo_service.periodic_task [req-f3691cf7-361d-40e2-a5a3-77afaab20ef2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:37:32.061 143779 DEBUG oslo_concurrency.lockutils [req-32d1d196-ea31-4e07-8156-455deeb43ada - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:37:32.062 143779 DEBUG oslo_concurrency.lockutils [req-32d1d196-ea31-4e07-8156-455deeb43ada - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:37:43.041 143781 DEBUG oslo_service.periodic_task [req-2f1b26db-557c-4922-97a3-f17fe9d1ae90 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:37:43.045 143781 DEBUG oslo_concurrency.lockutils [req-97f8f77d-135f-4b1a-a88e-4b734c952a11 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:37:43.046 143781 DEBUG oslo_concurrency.lockutils [req-97f8f77d-135f-4b1a-a88e-4b734c952a11 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:37:45.097 143787 DEBUG oslo_service.periodic_task [req-fb12227d-ed77-46d3-b2ba-d614e6a463a7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:37:45.102 143787 DEBUG oslo_concurrency.lockutils [req-4a0d2111-4eb8-4631-9e14-b18f207c6385 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:37:45.102 143787 DEBUG oslo_concurrency.lockutils [req-4a0d2111-4eb8-4631-9e14-b18f207c6385 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:38:00.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4830900e6d7e40a19ce35f5070a86e7a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:38:00.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4830900e6d7e40a19ce35f5070a86e7a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:38:00.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:00.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:00.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4830900e6d7e40a19ce35f5070a86e7a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:38:00.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4830900e6d7e40a19ce35f5070a86e7a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:38:00.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:00.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:00.812 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:00.812 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:00.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4830900e6d7e40a19ce35f5070a86e7a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:38:00.813 143779 DEBUG oslo_concurrency.lockutils [req-2e1cba4c-0d5c-4953-a457-66bb71baa7da - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:38:00.813 143780 DEBUG oslo_concurrency.lockutils [req-2e1cba4c-0d5c-4953-a457-66bb71baa7da - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:38:00.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:00.813 143779 DEBUG nova.scheduler.host_manager [req-2e1cba4c-0d5c-4953-a457-66bb71baa7da - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:38:00.813 143780 DEBUG nova.scheduler.host_manager [req-2e1cba4c-0d5c-4953-a457-66bb71baa7da - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:38:00.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4830900e6d7e40a19ce35f5070a86e7a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:38:00.813 143779 DEBUG oslo_concurrency.lockutils [req-2e1cba4c-0d5c-4953-a457-66bb71baa7da - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:38:00.813 143780 DEBUG oslo_concurrency.lockutils [req-2e1cba4c-0d5c-4953-a457-66bb71baa7da - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:38:00.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:00.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:38:00.814 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:00.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:38:00.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:00.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:00.814 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:00.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:00.814 143781 DEBUG oslo_concurrency.lockutils [req-2e1cba4c-0d5c-4953-a457-66bb71baa7da - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:38:00.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4830900e6d7e40a19ce35f5070a86e7a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:38:00.815 143781 DEBUG nova.scheduler.host_manager [req-2e1cba4c-0d5c-4953-a457-66bb71baa7da - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:38:00.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:00.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4830900e6d7e40a19ce35f5070a86e7a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:38:00.815 143781 DEBUG oslo_concurrency.lockutils [req-2e1cba4c-0d5c-4953-a457-66bb71baa7da - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:38:00.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:00.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:00.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:38:00.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:00.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:00.816 143787 DEBUG oslo_concurrency.lockutils [req-2e1cba4c-0d5c-4953-a457-66bb71baa7da - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:38:00.816 143787 DEBUG nova.scheduler.host_manager [req-2e1cba4c-0d5c-4953-a457-66bb71baa7da - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:38:00.816 143787 DEBUG oslo_concurrency.lockutils [req-2e1cba4c-0d5c-4953-a457-66bb71baa7da - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:38:00.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:38:00.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:00.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:01.815 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:38:01.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:01.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:01.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:38:01.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:01.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:01.819 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:38:01.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:01.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:01.826 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:38:01.826 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:01.827 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:02.055 143780 DEBUG oslo_service.periodic_task [req-0fb0c5f3-e775-4c8f-831a-3bded335e8a9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:38:02.059 143780 DEBUG oslo_concurrency.lockutils [req-a8f2461c-fbe2-4ebe-9d9c-b29ff5edd2cc - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:38:02.059 143780 DEBUG oslo_concurrency.lockutils [req-a8f2461c-fbe2-4ebe-9d9c-b29ff5edd2cc - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:38:03.057 143779 DEBUG oslo_service.periodic_task [req-32d1d196-ea31-4e07-8156-455deeb43ada - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:38:03.062 143779 DEBUG oslo_concurrency.lockutils [req-083a95da-f385-4b4f-8639-e05ef3c639f5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:38:03.062 143779 DEBUG oslo_concurrency.lockutils [req-083a95da-f385-4b4f-8639-e05ef3c639f5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:38:03.818 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:38:03.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:03.819 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:03.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:38:03.821 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:03.821 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:03.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:38:03.823 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:03.823 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:03.828 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:38:03.829 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:03.829 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:07.821 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:38:07.821 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:07.821 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:07.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:38:07.823 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:07.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:07.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:38:07.826 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:07.826 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:07.831 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:38:07.832 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:07.832 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:14.042 143781 DEBUG oslo_service.periodic_task [req-97f8f77d-135f-4b1a-a88e-4b734c952a11 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:38:14.059 143781 DEBUG oslo_concurrency.lockutils [req-21ec148c-2d2a-4e16-a67b-91b6ea32bf37 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:38:14.069 143781 DEBUG oslo_concurrency.lockutils [req-21ec148c-2d2a-4e16-a67b-91b6ea32bf37 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.011s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:38:15.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:38:15.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:15.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:15.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:38:15.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:15.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:15.827 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:38:15.828 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:15.828 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:15.837 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:38:15.838 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:15.838 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:16.082 143787 DEBUG oslo_service.periodic_task [req-4a0d2111-4eb8-4631-9e14-b18f207c6385 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:38:16.087 143787 DEBUG oslo_concurrency.lockutils [req-0286f137-abb1-4832-8e90-a0e88d01b907 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:38:16.093 143787 DEBUG oslo_concurrency.lockutils [req-0286f137-abb1-4832-8e90-a0e88d01b907 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.007s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:38:31.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:38:31.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:31.825 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:31.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:38:31.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:31.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:31.829 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:38:31.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:31.830 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:31.840 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:38:31.840 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:38:31.840 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:38:33.045 143780 DEBUG oslo_service.periodic_task [req-a8f2461c-fbe2-4ebe-9d9c-b29ff5edd2cc - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:38:33.049 143780 DEBUG oslo_concurrency.lockutils [req-a6084ad9-1d92-47b5-9335-7fa33c7efd00 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:38:33.049 143780 DEBUG oslo_concurrency.lockutils [req-a6084ad9-1d92-47b5-9335-7fa33c7efd00 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:38:34.057 143779 DEBUG oslo_service.periodic_task [req-083a95da-f385-4b4f-8639-e05ef3c639f5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:38:34.062 143779 DEBUG oslo_concurrency.lockutils [req-e95ce069-fbaf-4e39-946e-3b76c773671d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:38:34.062 143779 DEBUG oslo_concurrency.lockutils [req-e95ce069-fbaf-4e39-946e-3b76c773671d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:38:44.080 143781 DEBUG oslo_service.periodic_task [req-21ec148c-2d2a-4e16-a67b-91b6ea32bf37 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:38:44.084 143781 DEBUG oslo_concurrency.lockutils [req-8dcc4963-39bf-45de-8bff-500cf529074c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:38:44.085 143781 DEBUG oslo_concurrency.lockutils [req-8dcc4963-39bf-45de-8bff-500cf529074c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:38:47.083 143787 DEBUG oslo_service.periodic_task [req-0286f137-abb1-4832-8e90-a0e88d01b907 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:38:47.088 143787 DEBUG oslo_concurrency.lockutils [req-4795d32f-5995-45ea-9cc2-8375bdc9b249 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:38:47.089 143787 DEBUG oslo_concurrency.lockutils [req-4795d32f-5995-45ea-9cc2-8375bdc9b249 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:39:03.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:39:03.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:39:03.827 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:39:03.836 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:39:03.836 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:39:03.836 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:39:03.839 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:39:03.839 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:39:03.839 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:39:03.849 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:39:03.850 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:39:03.850 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:39:04.044 143780 DEBUG oslo_service.periodic_task [req-a6084ad9-1d92-47b5-9335-7fa33c7efd00 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:39:04.048 143780 DEBUG oslo_concurrency.lockutils [req-c2bc7271-ee4a-4d89-957e-6883ff7374f6 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:39:04.048 143780 DEBUG oslo_concurrency.lockutils [req-c2bc7271-ee4a-4d89-957e-6883ff7374f6 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:39:04.068 143779 DEBUG oslo_service.periodic_task [req-e95ce069-fbaf-4e39-946e-3b76c773671d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:39:04.072 143779 DEBUG oslo_concurrency.lockutils [req-01daeaee-24b4-4778-bcdd-dac30294bb87 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:39:04.072 143779 DEBUG oslo_concurrency.lockutils [req-01daeaee-24b4-4778-bcdd-dac30294bb87 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:39:14.093 143781 DEBUG oslo_service.periodic_task [req-8dcc4963-39bf-45de-8bff-500cf529074c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:39:14.099 143781 DEBUG oslo_concurrency.lockutils [req-c9e364de-31bf-4d73-a899-189dfb522c2b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:39:14.099 143781 DEBUG oslo_concurrency.lockutils [req-c9e364de-31bf-4d73-a899-189dfb522c2b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:39:18.083 143787 DEBUG oslo_service.periodic_task [req-4795d32f-5995-45ea-9cc2-8375bdc9b249 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:39:18.087 143787 DEBUG oslo_concurrency.lockutils [req-40a1c22e-7991-49be-ad43-3c945c2eb853 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:39:18.087 143787 DEBUG oslo_concurrency.lockutils [req-40a1c22e-7991-49be-ad43-3c945c2eb853 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:39:34.085 143779 DEBUG oslo_service.periodic_task [req-01daeaee-24b4-4778-bcdd-dac30294bb87 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:39:34.090 143779 DEBUG oslo_concurrency.lockutils [req-7f8eb86e-7c58-45a0-ac5a-c34664b0bd42 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:39:34.090 143779 DEBUG oslo_concurrency.lockutils [req-7f8eb86e-7c58-45a0-ac5a-c34664b0bd42 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:39:35.045 143780 DEBUG oslo_service.periodic_task [req-c2bc7271-ee4a-4d89-957e-6883ff7374f6 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:39:35.050 143780 DEBUG oslo_concurrency.lockutils [req-a23185cb-a687-4007-ab33-587b051510b2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:39:35.051 143780 DEBUG oslo_concurrency.lockutils [req-a23185cb-a687-4007-ab33-587b051510b2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:39:44.109 143781 DEBUG oslo_service.periodic_task [req-c9e364de-31bf-4d73-a899-189dfb522c2b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:39:44.113 143781 DEBUG oslo_concurrency.lockutils [req-1f8c2e92-0b2d-46e3-98b5-38bc5442f5ab - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e"
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:39:44.114 143781 DEBUG oslo_concurrency.lockutils [req-1f8c2e92-0b2d-46e3-98b5-38bc5442f5ab - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:39:49.083 143787 DEBUG oslo_service.periodic_task [req-40a1c22e-7991-49be-ad43-3c945c2eb853 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:39:49.089 143787 DEBUG oslo_concurrency.lockutils [req-75d591cf-3122-486e-88c2-24ea35d73154 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:39:49.089 143787 DEBUG oslo_concurrency.lockutils [req-75d591cf-3122-486e-88c2-24ea35d73154 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:40:03.962 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 74decd5783f04fd880a578f7b30612f5 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:40:03.962 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 74decd5783f04fd880a578f7b30612f5 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:40:03.962 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with 
unique_id: 74decd5783f04fd880a578f7b30612f5 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:40:03.962 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:03.962 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:03.962 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:03.962 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 74decd5783f04fd880a578f7b30612f5 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:40:03.962 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 74decd5783f04fd880a578f7b30612f5 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:40:03.962 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 74decd5783f04fd880a578f7b30612f5 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:40:03.962 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:03.962 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:03.962 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:03.962 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:03.962 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:03.963 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:03.963 143781 DEBUG oslo_concurrency.lockutils [req-1813fdfe-01f2-4c8e-9415-1b2f3868c501 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:40:03.963 143780 DEBUG oslo_concurrency.lockutils [req-1813fdfe-01f2-4c8e-9415-1b2f3868c501 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:40:03.963 143787 DEBUG oslo_concurrency.lockutils [req-1813fdfe-01f2-4c8e-9415-1b2f3868c501 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:40:03.963 143780 DEBUG nova.scheduler.host_manager [req-1813fdfe-01f2-4c8e-9415-1b2f3868c501 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:40:03.963 143787 DEBUG nova.scheduler.host_manager [req-1813fdfe-01f2-4c8e-9415-1b2f3868c501 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:40:03.963 143781 DEBUG nova.scheduler.host_manager [req-1813fdfe-01f2-4c8e-9415-1b2f3868c501 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:40:03.963 143780 DEBUG oslo_concurrency.lockutils [req-1813fdfe-01f2-4c8e-9415-1b2f3868c501 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:40:03.963 143781 DEBUG oslo_concurrency.lockutils [req-1813fdfe-01f2-4c8e-9415-1b2f3868c501 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:40:03.964 143787 DEBUG oslo_concurrency.lockutils [req-1813fdfe-01f2-4c8e-9415-1b2f3868c501 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:40:03.964 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:40:03.964 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:03.964 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:03.964 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 74decd5783f04fd880a578f7b30612f5 __call__ 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:40:03.965 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:03.965 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 74decd5783f04fd880a578f7b30612f5 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:40:03.965 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:03.965 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:03.965 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:40:03.965 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:03.965 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:40:03.965 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:03.965 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:03.965 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:03.966 143779 DEBUG oslo_concurrency.lockutils 
[req-1813fdfe-01f2-4c8e-9415-1b2f3868c501 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:40:03.966 143779 DEBUG nova.scheduler.host_manager [req-1813fdfe-01f2-4c8e-9415-1b2f3868c501 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:40:03.966 143779 DEBUG oslo_concurrency.lockutils [req-1813fdfe-01f2-4c8e-9415-1b2f3868c501 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:40:03.967 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:40:03.967 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:03.967 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:04.965 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:40:04.965 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:04.965 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:04.966 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] 
AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:40:04.966 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:04.966 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:04.967 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:40:04.967 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:04.967 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:04.969 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:40:04.969 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:04.969 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:05.057 143779 DEBUG oslo_service.periodic_task [req-7f8eb86e-7c58-45a0-ac5a-c34664b0bd42 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:40:05.058 143780 DEBUG oslo_service.periodic_task [req-a23185cb-a687-4007-ab33-587b051510b2 - - - - -] Running periodic task 
SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:40:05.060 143779 DEBUG oslo_concurrency.lockutils [req-52cce5f0-1c9f-4cf2-8e18-1bfedb53af17 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:40:05.060 143779 DEBUG oslo_concurrency.lockutils [req-52cce5f0-1c9f-4cf2-8e18-1bfedb53af17 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:40:05.062 143780 DEBUG oslo_concurrency.lockutils [req-427ac75c-310f-44f3-ad45-b3aa178769d7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:40:05.062 143780 DEBUG oslo_concurrency.lockutils [req-427ac75c-310f-44f3-ad45-b3aa178769d7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:40:06.968 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:40:06.968 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:06.968 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:06.968 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:40:06.968 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:06.969 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:06.969 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:40:06.970 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:06.970 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:06.971 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:40:06.971 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:06.971 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:10.972 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:40:10.972 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:10.972 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:10.972 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:40:10.973 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:10.973 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:10.973 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:40:10.973 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:40:10.973 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:10.974 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:10.973 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:10.974 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:14.120 143781 DEBUG oslo_service.periodic_task 
[req-1f8c2e92-0b2d-46e3-98b5-38bc5442f5ab - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:40:14.124 143781 DEBUG oslo_concurrency.lockutils [req-348591f7-6437-44d8-8cdc-f592ea858efa - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:40:14.124 143781 DEBUG oslo_concurrency.lockutils [req-348591f7-6437-44d8-8cdc-f592ea858efa - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:40:18.976 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:40:18.976 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:18.976 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:40:18.977 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:18.977 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:18.977 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:18.978 143780 
DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:40:18.978 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:40:18.978 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:18.978 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:18.978 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:18.978 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:20.083 143787 DEBUG oslo_service.periodic_task [req-75d591cf-3122-486e-88c2-24ea35d73154 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:40:20.087 143787 DEBUG oslo_concurrency.lockutils [req-bb8af940-0bc4-4669-967f-24cb448c6bf3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:40:20.087 143787 DEBUG oslo_concurrency.lockutils [req-bb8af940-0bc4-4669-967f-24cb448c6bf3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 
2026-04-02 01:40:34.979 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:40:34.979 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:40:34.979 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:34.979 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:34.979 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:34.979 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:40:34.980 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:40:34.980 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:40:34.981 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:34.981 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:40:34.981 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 
2026-04-02 01:40:34.981 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:40:35.064 143779 DEBUG oslo_service.periodic_task [req-52cce5f0-1c9f-4cf2-8e18-1bfedb53af17 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:40:35.066 143780 DEBUG oslo_service.periodic_task [req-427ac75c-310f-44f3-ad45-b3aa178769d7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:40:35.069 143779 DEBUG oslo_concurrency.lockutils [req-e3927b06-5b1f-4b48-97ca-9a56f5a96fbe - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:40:35.069 143779 DEBUG oslo_concurrency.lockutils [req-e3927b06-5b1f-4b48-97ca-9a56f5a96fbe - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:40:35.070 143780 DEBUG oslo_concurrency.lockutils [req-cc7801ac-bcde-4cfd-9e27-49e5b28321a7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:40:35.070 143780 DEBUG oslo_concurrency.lockutils [req-cc7801ac-bcde-4cfd-9e27-49e5b28321a7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:40:45.042 143781 DEBUG oslo_service.periodic_task [req-348591f7-6437-44d8-8cdc-f592ea858efa - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:40:45.047 143781 DEBUG oslo_concurrency.lockutils [req-bd75f881-62fc-4aee-b658-ec5537f11d5f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:40:45.048 143781 DEBUG oslo_concurrency.lockutils [req-bd75f881-62fc-4aee-b658-ec5537f11d5f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:40:50.092 143787 DEBUG oslo_service.periodic_task [req-bb8af940-0bc4-4669-967f-24cb448c6bf3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:40:50.097 143787 DEBUG oslo_concurrency.lockutils [req-dbed7e1a-9dff-4714-b670-eb5d0ac5046a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:40:50.097 143787 DEBUG oslo_concurrency.lockutils [req-dbed7e1a-9dff-4714-b670-eb5d0ac5046a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:41:05.085 143780 DEBUG oslo_service.periodic_task [req-cc7801ac-bcde-4cfd-9e27-49e5b28321a7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:41:05.089 143780 DEBUG oslo_concurrency.lockutils [req-ee2f5969-2ac1-45d0-ba04-e3775dffa76f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:41:05.090 143780 DEBUG oslo_concurrency.lockutils [req-ee2f5969-2ac1-45d0-ba04-e3775dffa76f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:41:06.057 143779 DEBUG oslo_service.periodic_task [req-e3927b06-5b1f-4b48-97ca-9a56f5a96fbe - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:41:06.061 143779 DEBUG oslo_concurrency.lockutils [req-9e233fea-07ff-4fef-bfd1-0d06219c0e91 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:41:06.062 143779 DEBUG oslo_concurrency.lockutils [req-9e233fea-07ff-4fef-bfd1-0d06219c0e91 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:41:06.981 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:41:06.981 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:41:06.981 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:41:06.981 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:41:06.981 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:41:06.982 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:41:06.982 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:41:06.983 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:41:06.983 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:41:06.983 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:41:06.984 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:41:06.984 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:41:15.057 143781 DEBUG oslo_service.periodic_task [req-bd75f881-62fc-4aee-b658-ec5537f11d5f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:41:15.062 143781 DEBUG oslo_concurrency.lockutils [req-cae5cbc9-3f85-442d-b262-5ba3759207db - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:41:15.062 143781 DEBUG oslo_concurrency.lockutils [req-cae5cbc9-3f85-442d-b262-5ba3759207db - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:41:21.084 143787 DEBUG oslo_service.periodic_task [req-dbed7e1a-9dff-4714-b670-eb5d0ac5046a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:41:21.088 143787 DEBUG oslo_concurrency.lockutils [req-912076ee-675b-4e91-bbf5-316e2f3474a9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:41:21.088 143787 DEBUG oslo_concurrency.lockutils [req-912076ee-675b-4e91-bbf5-316e2f3474a9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:41:35.103 143780 DEBUG oslo_service.periodic_task [req-ee2f5969-2ac1-45d0-ba04-e3775dffa76f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:41:35.112 143780 DEBUG oslo_concurrency.lockutils [req-6d92bdfb-7089-48a5-9c8b-e63e794ba60e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:41:35.112 143780 DEBUG oslo_concurrency.lockutils [req-6d92bdfb-7089-48a5-9c8b-e63e794ba60e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:41:36.075 143779 DEBUG oslo_service.periodic_task [req-9e233fea-07ff-4fef-bfd1-0d06219c0e91 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:41:36.080 143779 DEBUG oslo_concurrency.lockutils [req-1d74aa7b-46c2-4639-9eda-71e52d6b80d7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:41:36.080 143779 DEBUG oslo_concurrency.lockutils [req-1d74aa7b-46c2-4639-9eda-71e52d6b80d7 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:41:45.073 143781 DEBUG oslo_service.periodic_task [req-cae5cbc9-3f85-442d-b262-5ba3759207db - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:41:45.080 143781 DEBUG oslo_concurrency.lockutils [req-7491087f-15c5-4ae6-9908-2a7a3a7e1236 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:41:45.081 143781 DEBUG oslo_concurrency.lockutils [req-7491087f-15c5-4ae6-9908-2a7a3a7e1236 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:41:51.097 143787 DEBUG oslo_service.periodic_task [req-912076ee-675b-4e91-bbf5-316e2f3474a9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:41:51.102 143787 DEBUG oslo_concurrency.lockutils [req-8d3131e4-b896-44f9-8107-58923626876c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:41:51.102 143787 DEBUG oslo_concurrency.lockutils [req-8d3131e4-b896-44f9-8107-58923626876c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:42:06.044 143780 DEBUG oslo_service.periodic_task [req-6d92bdfb-7089-48a5-9c8b-e63e794ba60e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:42:06.048 143780 DEBUG oslo_concurrency.lockutils [req-7d60cb0a-7b41-4998-bc08-ea875388fa98 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:42:06.048 143780 DEBUG oslo_concurrency.lockutils [req-7d60cb0a-7b41-4998-bc08-ea875388fa98 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:42:06.094 143779 DEBUG oslo_service.periodic_task [req-1d74aa7b-46c2-4639-9eda-71e52d6b80d7 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:42:06.100 143779 DEBUG oslo_concurrency.lockutils [req-b0fe58dc-9db9-4431-9958-d0d78f24c6fb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:42:06.101 143779 DEBUG oslo_concurrency.lockutils [req-b0fe58dc-9db9-4431-9958-d0d78f24c6fb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:42:16.041 143781 DEBUG oslo_service.periodic_task [req-7491087f-15c5-4ae6-9908-2a7a3a7e1236 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:42:16.045 143781 DEBUG oslo_concurrency.lockutils [req-7700f02e-7e65-4857-97a7-5d25789920da - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:42:16.045 143781 DEBUG oslo_concurrency.lockutils [req-7700f02e-7e65-4857-97a7-5d25789920da - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:42:22.083 143787 DEBUG oslo_service.periodic_task [req-8d3131e4-b896-44f9-8107-58923626876c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:42:22.088 143787 DEBUG oslo_concurrency.lockutils [req-8792c57f-3ed6-46d6-9d9f-83998d13ded3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:42:22.088 143787 DEBUG oslo_concurrency.lockutils [req-8792c57f-3ed6-46d6-9d9f-83998d13ded3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:42:23.147 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:42:23.147 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:42:23.147 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:42:23.153 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:42:23.154 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:42:23.154 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:42:23.165 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:42:23.166 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:42:23.166 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:42:23.189 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:42:23.190 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:42:23.190 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:42:36.055 143780 DEBUG oslo_service.periodic_task [req-7d60cb0a-7b41-4998-bc08-ea875388fa98 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:42:36.059 143780 DEBUG oslo_concurrency.lockutils [req-81640a8b-07cc-47a3-a5cf-048e0c10454c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:42:36.059 143780 DEBUG oslo_concurrency.lockutils [req-81640a8b-07cc-47a3-a5cf-048e0c10454c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:42:36.116 143779 DEBUG oslo_service.periodic_task [req-b0fe58dc-9db9-4431-9958-d0d78f24c6fb - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:42:36.120 143779 DEBUG oslo_concurrency.lockutils [req-6cbc7274-19c9-48f1-9f33-745ff7f88eb0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:42:36.121 143779 DEBUG oslo_concurrency.lockutils [req-6cbc7274-19c9-48f1-9f33-745ff7f88eb0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:42:47.041 143781 DEBUG oslo_service.periodic_task [req-7700f02e-7e65-4857-97a7-5d25789920da - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:42:47.047 143781 DEBUG oslo_concurrency.lockutils [req-22420fb7-cb9f-4f80-a8ff-802af0ad6fee - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:42:47.047 143781 DEBUG oslo_concurrency.lockutils [req-22420fb7-cb9f-4f80-a8ff-802af0ad6fee - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:42:53.083 143787 DEBUG oslo_service.periodic_task [req-8792c57f-3ed6-46d6-9d9f-83998d13ded3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:42:53.087 143787 DEBUG oslo_concurrency.lockutils [req-1186a361-f360-4de4-b9f3-edd5aa4b1c88 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:42:53.088 143787 DEBUG oslo_concurrency.lockutils [req-1186a361-f360-4de4-b9f3-edd5aa4b1c88 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:43:06.065 143780 DEBUG oslo_service.periodic_task [req-81640a8b-07cc-47a3-a5cf-048e0c10454c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:43:06.071 143780 DEBUG oslo_concurrency.lockutils [req-9a370201-9d4b-4584-88a3-d2c77d0f0061 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:43:06.071 143780 DEBUG oslo_concurrency.lockutils [req-9a370201-9d4b-4584-88a3-d2c77d0f0061 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:43:07.057 143779 DEBUG oslo_service.periodic_task [req-6cbc7274-19c9-48f1-9f33-745ff7f88eb0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:43:07.062 143779 DEBUG oslo_concurrency.lockutils [req-07dc2a43-2b9c-489c-9e1b-dfc8ef12d098 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:43:07.062 143779 DEBUG oslo_concurrency.lockutils [req-07dc2a43-2b9c-489c-9e1b-dfc8ef12d098 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:43:17.055 143781 DEBUG oslo_service.periodic_task [req-22420fb7-cb9f-4f80-a8ff-802af0ad6fee - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:43:17.060 143781 DEBUG oslo_concurrency.lockutils [req-ea6d786c-dd42-41ac-b647-5cd58645b536 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:43:17.060 143781 DEBUG oslo_concurrency.lockutils [req-ea6d786c-dd42-41ac-b647-5cd58645b536 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:43:23.099 143787 DEBUG oslo_service.periodic_task [req-1186a361-f360-4de4-b9f3-edd5aa4b1c88 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:43:23.104 143787 DEBUG oslo_concurrency.lockutils [req-d1acc4cb-82f6-43e6-9efe-3f56080e3445 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:43:23.105 143787 DEBUG oslo_concurrency.lockutils [req-d1acc4cb-82f6-43e6-9efe-3f56080e3445 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:43:36.077 143780 DEBUG oslo_service.periodic_task [req-9a370201-9d4b-4584-88a3-d2c77d0f0061 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:43:36.082 143780 DEBUG oslo_concurrency.lockutils [req-8a4dbfdc-a334-465a-a8cd-e15f3fff33be - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:43:36.082 143780 DEBUG oslo_concurrency.lockutils [req-8a4dbfdc-a334-465a-a8cd-e15f3fff33be - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:43:37.069 143779 DEBUG oslo_service.periodic_task [req-07dc2a43-2b9c-489c-9e1b-dfc8ef12d098 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:43:37.073 143779 DEBUG oslo_concurrency.lockutils [req-41c0a8c3-c0f0-4d0e-a2b5-ebda7c8e4925 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:43:37.074 143779 DEBUG oslo_concurrency.lockutils [req-41c0a8c3-c0f0-4d0e-a2b5-ebda7c8e4925 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:43:48.041 143781 DEBUG oslo_service.periodic_task [req-ea6d786c-dd42-41ac-b647-5cd58645b536 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:43:48.045 143781 DEBUG oslo_concurrency.lockutils [req-3c5cb65e-5408-4f13-a62b-79d08c9aefb2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:43:48.046 143781 DEBUG oslo_concurrency.lockutils [req-3c5cb65e-5408-4f13-a62b-79d08c9aefb2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:43:53.117 143787 DEBUG oslo_service.periodic_task [req-d1acc4cb-82f6-43e6-9efe-3f56080e3445 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:43:53.124 143787 DEBUG oslo_concurrency.lockutils [req-b6b66bb3-7ffb-4d35-9a3c-a6cfefc3921b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:43:53.124 143787 DEBUG oslo_concurrency.lockutils [req-b6b66bb3-7ffb-4d35-9a3c-a6cfefc3921b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:44:06.088 143780 DEBUG oslo_service.periodic_task [req-8a4dbfdc-a334-465a-a8cd-e15f3fff33be - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:44:06.092 143780 DEBUG oslo_concurrency.lockutils [req-e1602208-7796-433e-9646-f0ca3a9acf8b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:44:06.092 143780 DEBUG oslo_concurrency.lockutils [req-e1602208-7796-433e-9646-f0ca3a9acf8b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:44:08.057 143779 DEBUG oslo_service.periodic_task [req-41c0a8c3-c0f0-4d0e-a2b5-ebda7c8e4925 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:44:08.062 143779 DEBUG oslo_concurrency.lockutils [req-00343a7e-e6cb-452c-aa54-51c18e2195fa - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:44:08.062 143779 DEBUG oslo_concurrency.lockutils [req-00343a7e-e6cb-452c-aa54-51c18e2195fa - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:44:14.782 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: efa3d85e3173451d9cb2dcb3950a45ad __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:44:14.782 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: efa3d85e3173451d9cb2dcb3950a45ad __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:44:14.782 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: efa3d85e3173451d9cb2dcb3950a45ad __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:44:14.783 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:44:14.783 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:44:14.783 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:44:14.783 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: efa3d85e3173451d9cb2dcb3950a45ad poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:44:14.783 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: efa3d85e3173451d9cb2dcb3950a45ad poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:44:14.783 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: efa3d85e3173451d9cb2dcb3950a45ad poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:44:14.783 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:44:14.783 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:44:14.783 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:44:14.783 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:44:14.783 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:44:14.783 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:44:14.784 143781 DEBUG oslo_concurrency.lockutils [req-7c86486e-2fe7-49ac-be7c-983d6c6e6880 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:44:14.784 143780 DEBUG oslo_concurrency.lockutils [req-7c86486e-2fe7-49ac-be7c-983d6c6e6880 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:44:14.784 143787 DEBUG oslo_concurrency.lockutils [req-7c86486e-2fe7-49ac-be7c-983d6c6e6880 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:44:14.784 143781 DEBUG nova.scheduler.host_manager [req-7c86486e-2fe7-49ac-be7c-983d6c6e6880 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:44:14.784 143780 DEBUG nova.scheduler.host_manager [req-7c86486e-2fe7-49ac-be7c-983d6c6e6880 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:44:14.784 143787 DEBUG nova.scheduler.host_manager [req-7c86486e-2fe7-49ac-be7c-983d6c6e6880 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:44:14.784 143781 DEBUG oslo_concurrency.lockutils [req-7c86486e-2fe7-49ac-be7c-983d6c6e6880 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:44:14.784 143780 DEBUG oslo_concurrency.lockutils [req-7c86486e-2fe7-49ac-be7c-983d6c6e6880 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:44:14.784 143787 DEBUG oslo_concurrency.lockutils [req-7c86486e-2fe7-49ac-be7c-983d6c6e6880 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:44:14.785 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:44:14.785 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:44:14.785 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:44:14.785 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:44:14.785 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:44:14.785 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:44:14.785 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: efa3d85e3173451d9cb2dcb3950a45ad __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:44:14.785 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:44:14.785 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: efa3d85e3173451d9cb2dcb3950a45ad poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:44:14.786 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:44:14.786 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:44:14.786 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:44:14.786 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:44:14.786 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:44:14.786 143779 DEBUG oslo_concurrency.lockutils [req-7c86486e-2fe7-49ac-be7c-983d6c6e6880 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:44:14.786 143779 DEBUG nova.scheduler.host_manager
[req-7c86486e-2fe7-49ac-be7c-983d6c6e6880 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:44:14.787 143779 DEBUG oslo_concurrency.lockutils [req-7c86486e-2fe7-49ac-be7c-983d6c6e6880 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:44:14.788 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:44:14.788 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:44:14.788 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:44:15.786 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:44:15.786 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:44:15.787 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:44:15.787 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:44:15.787 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:44:15.787 
143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:44:15.787 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:44:15.787 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:44:15.788 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:44:15.790 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:44:15.790 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:44:15.790 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:44:17.789 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:44:17.789 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:44:17.789 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:44:17.789 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 
01:44:17.790 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:44:17.790 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:44:17.790 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:44:17.790 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:44:17.790 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:44:17.792 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:44:17.793 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:44:17.793 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:44:18.051 143781 DEBUG oslo_service.periodic_task [req-3c5cb65e-5408-4f13-a62b-79d08c9aefb2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:44:18.057 143781 DEBUG oslo_concurrency.lockutils [req-cd49a760-0528-45c3-82da-da630bb75228 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:44:18.057 143781 DEBUG oslo_concurrency.lockutils [req-cd49a760-0528-45c3-82da-da630bb75228 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:44:21.790 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:44:21.791 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:44:21.791 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:44:21.792 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:44:21.792 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:44:21.793 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:44:21.793 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:44:21.794 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:44:21.794 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:44:21.797 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:44:21.798 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:44:21.798 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:44:23.131 143787 DEBUG oslo_service.periodic_task [req-b6b66bb3-7ffb-4d35-9a3c-a6cfefc3921b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:44:23.136 143787 DEBUG oslo_concurrency.lockutils [req-d1b9adf7-89f3-445b-bc86-292a10c39488 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:44:23.136 143787 DEBUG oslo_concurrency.lockutils [req-d1b9adf7-89f3-445b-bc86-292a10c39488 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:44:29.796 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:44:29.796 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:44:29.796 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] 
AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:44:29.796 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:44:29.797 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:44:29.797 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:44:29.798 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:44:29.799 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:44:29.799 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:44:29.801 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:44:29.802 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:44:29.802 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:44:37.045 143780 DEBUG oslo_service.periodic_task [req-e1602208-7796-433e-9646-f0ca3a9acf8b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks 
/usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:44:37.050 143780 DEBUG oslo_concurrency.lockutils [req-a5ec9562-86f9-4947-8f06-53ceb446002b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:44:37.051 143780 DEBUG oslo_concurrency.lockutils [req-a5ec9562-86f9-4947-8f06-53ceb446002b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:44:39.057 143779 DEBUG oslo_service.periodic_task [req-00343a7e-e6cb-452c-aa54-51c18e2195fa - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:44:39.062 143779 DEBUG oslo_concurrency.lockutils [req-bdbbe052-a919-49ec-b204-63289cefe4c1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:44:39.062 143779 DEBUG oslo_concurrency.lockutils [req-bdbbe052-a919-49ec-b204-63289cefe4c1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:44:45.798 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:44:45.798 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:44:45.798 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:44:45.799 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:44:45.799 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:44:45.799 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:44:45.801 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:44:45.801 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:44:45.801 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:44:45.804 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:44:45.804 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:44:45.804 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:44:48.066 143781 DEBUG oslo_service.periodic_task 
[req-cd49a760-0528-45c3-82da-da630bb75228 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:44:48.070 143781 DEBUG oslo_concurrency.lockutils [req-d9c70f46-7c7c-41fd-94b7-d22487f831a6 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:44:48.070 143781 DEBUG oslo_concurrency.lockutils [req-d9c70f46-7c7c-41fd-94b7-d22487f831a6 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:44:53.145 143787 DEBUG oslo_service.periodic_task [req-d1b9adf7-89f3-445b-bc86-292a10c39488 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:44:53.150 143787 DEBUG oslo_concurrency.lockutils [req-344841fd-bb56-47fa-8ab9-591eb680a599 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:44:53.150 143787 DEBUG oslo_concurrency.lockutils [req-344841fd-bb56-47fa-8ab9-591eb680a599 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:45:07.058 143780 DEBUG oslo_service.periodic_task [req-a5ec9562-86f9-4947-8f06-53ceb446002b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells 
run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:45:07.062 143780 DEBUG oslo_concurrency.lockutils [req-8294d204-10a4-4984-93f3-5f9da868700c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:45:07.063 143780 DEBUG oslo_concurrency.lockutils [req-8294d204-10a4-4984-93f3-5f9da868700c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:45:10.057 143779 DEBUG oslo_service.periodic_task [req-bdbbe052-a919-49ec-b204-63289cefe4c1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:45:10.063 143779 DEBUG oslo_concurrency.lockutils [req-67b5ea23-3c66-42c9-b3c5-4182526cb1a3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:45:10.063 143779 DEBUG oslo_concurrency.lockutils [req-67b5ea23-3c66-42c9-b3c5-4182526cb1a3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:45:19.041 143781 DEBUG oslo_service.periodic_task [req-d9c70f46-7c7c-41fd-94b7-d22487f831a6 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:45:19.045 143781 DEBUG 
oslo_concurrency.lockutils [req-80b664b9-60cc-438a-914f-311c13f50827 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:45:19.046 143781 DEBUG oslo_concurrency.lockutils [req-80b664b9-60cc-438a-914f-311c13f50827 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:45:23.149 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:45:23.150 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:45:23.150 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:45:23.158 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:45:23.158 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:45:23.159 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:45:23.161 143787 DEBUG oslo_service.periodic_task [req-344841fd-bb56-47fa-8ab9-591eb680a599 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 
2026-04-02 01:45:23.165 143787 DEBUG oslo_concurrency.lockutils [req-4da90b40-7774-49e3-9b7d-af1151975791 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:45:23.166 143787 DEBUG oslo_concurrency.lockutils [req-4da90b40-7774-49e3-9b7d-af1151975791 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:45:23.169 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:45:23.170 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:45:23.170 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:45:23.195 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:45:23.195 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:45:23.195 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:45:37.069 143780 DEBUG oslo_service.periodic_task [req-8294d204-10a4-4984-93f3-5f9da868700c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks 
/usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:45:37.074 143780 DEBUG oslo_concurrency.lockutils [req-6c168ca4-cb38-4e84-907e-db6fa547a070 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:45:37.074 143780 DEBUG oslo_concurrency.lockutils [req-6c168ca4-cb38-4e84-907e-db6fa547a070 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:45:40.069 143779 DEBUG oslo_service.periodic_task [req-67b5ea23-3c66-42c9-b3c5-4182526cb1a3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:45:40.073 143779 DEBUG oslo_concurrency.lockutils [req-98367af8-30b7-4f8c-a2e2-580a2303f97f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:45:40.074 143779 DEBUG oslo_concurrency.lockutils [req-98367af8-30b7-4f8c-a2e2-580a2303f97f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:45:50.041 143781 DEBUG oslo_service.periodic_task [req-80b664b9-60cc-438a-914f-311c13f50827 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:45:50.046 143781 DEBUG 
oslo_concurrency.lockutils [req-75d6466e-e5c2-4243-808e-9ca63035391b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:45:50.046 143781 DEBUG oslo_concurrency.lockutils [req-75d6466e-e5c2-4243-808e-9ca63035391b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:45:54.083 143787 DEBUG oslo_service.periodic_task [req-4da90b40-7774-49e3-9b7d-af1151975791 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:45:54.087 143787 DEBUG oslo_concurrency.lockutils [req-0062e87f-9a4e-4651-9b2b-a8b6cb9c5270 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:45:54.088 143787 DEBUG oslo_concurrency.lockutils [req-0062e87f-9a4e-4651-9b2b-a8b6cb9c5270 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:46:08.044 143780 DEBUG oslo_service.periodic_task [req-6c168ca4-cb38-4e84-907e-db6fa547a070 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:46:08.048 143780 DEBUG oslo_concurrency.lockutils [req-1199abd8-37f0-4524-8eeb-39b824c65836 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:46:08.048 143780 DEBUG oslo_concurrency.lockutils [req-1199abd8-37f0-4524-8eeb-39b824c65836 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:46:10.080 143779 DEBUG oslo_service.periodic_task [req-98367af8-30b7-4f8c-a2e2-580a2303f97f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:46:10.084 143779 DEBUG oslo_concurrency.lockutils [req-846434bc-fcf8-4ba5-bf7a-bfb95daf3e95 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:46:10.085 143779 DEBUG oslo_concurrency.lockutils [req-846434bc-fcf8-4ba5-bf7a-bfb95daf3e95 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:46:14.613 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 39549c39947144aeb0fc2d10b7694bae __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:46:14.613 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 39549c39947144aeb0fc2d10b7694bae __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:46:14.613 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 39549c39947144aeb0fc2d10b7694bae __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:46:14.613 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:14.613 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:14.614 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 39549c39947144aeb0fc2d10b7694bae poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:46:14.614 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:14.614 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 39549c39947144aeb0fc2d10b7694bae poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:46:14.614 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:14.614 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 39549c39947144aeb0fc2d10b7694bae poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:46:14.614 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:14.614 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:14.614 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:14.614 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:14.614 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:14.615 143781 DEBUG oslo_concurrency.lockutils [req-ab24f0a5-17d0-40d9-92ee-dbb4f2a36b32 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:46:14.615 143781 DEBUG nova.scheduler.host_manager [req-ab24f0a5-17d0-40d9-92ee-dbb4f2a36b32 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:46:14.615 143779 DEBUG oslo_concurrency.lockutils [req-ab24f0a5-17d0-40d9-92ee-dbb4f2a36b32 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:46:14.615 143781 DEBUG oslo_concurrency.lockutils [req-ab24f0a5-17d0-40d9-92ee-dbb4f2a36b32 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:46:14.615 143780 DEBUG oslo_concurrency.lockutils [req-ab24f0a5-17d0-40d9-92ee-dbb4f2a36b32 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:46:14.615 143779 DEBUG nova.scheduler.host_manager [req-ab24f0a5-17d0-40d9-92ee-dbb4f2a36b32 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:46:14.615 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:46:14.615 143780 DEBUG nova.scheduler.host_manager [req-ab24f0a5-17d0-40d9-92ee-dbb4f2a36b32 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:46:14.615 143779 DEBUG oslo_concurrency.lockutils [req-ab24f0a5-17d0-40d9-92ee-dbb4f2a36b32 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:46:14.616 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:14.616 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:14.616 143780 DEBUG oslo_concurrency.lockutils [req-ab24f0a5-17d0-40d9-92ee-dbb4f2a36b32 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:46:14.616 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:46:14.616 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:14.616 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:14.616 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:46:14.616 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:14.616 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:14.617 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 39549c39947144aeb0fc2d10b7694bae __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:46:14.618 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:14.618 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 39549c39947144aeb0fc2d10b7694bae poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:46:14.618 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:14.618 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:14.619 143787 DEBUG oslo_concurrency.lockutils [req-ab24f0a5-17d0-40d9-92ee-dbb4f2a36b32 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:46:14.619 143787 DEBUG nova.scheduler.host_manager [req-ab24f0a5-17d0-40d9-92ee-dbb4f2a36b32 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:46:14.620 143787 DEBUG oslo_concurrency.lockutils [req-ab24f0a5-17d0-40d9-92ee-dbb4f2a36b32 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:46:14.620 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:46:14.620 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:14.620 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:15.617 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:46:15.617 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:15.617 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:15.618 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:46:15.618 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:15.618 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:15.618 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:46:15.618 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:15.618 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:15.622 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:46:15.622 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:15.622 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:17.618 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:46:17.619 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:17.619 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:17.620 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:46:17.621 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:17.621 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:17.621 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:46:17.621 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:17.621 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:17.624 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:46:17.625 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:17.625 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:21.041 143781 DEBUG oslo_service.periodic_task [req-75d6466e-e5c2-4243-808e-9ca63035391b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:46:21.046 143781 DEBUG oslo_concurrency.lockutils [req-c8c88df5-11f6-4a86-a5e7-5152f49fcd28 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:46:21.046 143781 DEBUG oslo_concurrency.lockutils [req-c8c88df5-11f6-4a86-a5e7-5152f49fcd28 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:46:21.621 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:46:21.621 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:21.621 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:21.625 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:46:21.626 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:21.626 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:21.626 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:46:21.626 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:21.626 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:21.629 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:46:21.630 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:21.630 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:24.092 143787 DEBUG oslo_service.periodic_task [req-0062e87f-9a4e-4651-9b2b-a8b6cb9c5270 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:46:24.097 143787 DEBUG oslo_concurrency.lockutils [req-a658c08a-141f-40fb-b787-060494a51c16 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:46:24.098 143787 DEBUG oslo_concurrency.lockutils [req-a658c08a-141f-40fb-b787-060494a51c16 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:46:29.625 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:46:29.626 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:29.626 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:29.629 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:46:29.630 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:29.630 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:29.630 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:46:29.630 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:29.630 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:29.633 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:46:29.634 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:29.634 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:39.044 143780 DEBUG oslo_service.periodic_task [req-1199abd8-37f0-4524-8eeb-39b824c65836 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:46:39.049 143780 DEBUG oslo_concurrency.lockutils [req-095659ab-39cb-4bc8-a35b-575b1ccb0b85 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:46:39.049 143780 DEBUG oslo_concurrency.lockutils [req-095659ab-39cb-4bc8-a35b-575b1ccb0b85 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:46:41.058 143779 DEBUG oslo_service.periodic_task [req-846434bc-fcf8-4ba5-bf7a-bfb95daf3e95 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:46:41.061 143779 DEBUG oslo_concurrency.lockutils [req-59d3618d-87d2-4dd1-86fa-13787e84964b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:46:41.062 143779 DEBUG oslo_concurrency.lockutils [req-59d3618d-87d2-4dd1-86fa-13787e84964b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:46:45.628 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:46:45.628 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:45.628 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:45.631 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:46:45.632 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:45.632 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:45.632 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:46:45.633 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:45.633 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:45.635 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:46:45.635 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:46:45.635 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:46:51.055 143781 DEBUG oslo_service.periodic_task [req-c8c88df5-11f6-4a86-a5e7-5152f49fcd28 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:46:51.059 143781 DEBUG oslo_concurrency.lockutils [req-430327e4-0046-441e-add9-068063539988 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:46:51.059 143781 DEBUG oslo_concurrency.lockutils [req-430327e4-0046-441e-add9-068063539988 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:46:55.083 143787 DEBUG oslo_service.periodic_task [req-a658c08a-141f-40fb-b787-060494a51c16 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:46:55.088 143787 DEBUG oslo_concurrency.lockutils [req-98eb7153-586b-406b-8e8c-fdf499097637 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:46:55.088 143787 DEBUG oslo_concurrency.lockutils [req-98eb7153-586b-406b-8e8c-fdf499097637 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:47:09.056 143780 DEBUG oslo_service.periodic_task [req-095659ab-39cb-4bc8-a35b-575b1ccb0b85 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:47:09.060 143780 DEBUG oslo_concurrency.lockutils [req-a4a25a2e-2850-48a8-99fc-ec2eeb48cd59 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:47:09.060 143780 DEBUG oslo_concurrency.lockutils [req-a4a25a2e-2850-48a8-99fc-ec2eeb48cd59 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:47:11.070 143779 DEBUG oslo_service.periodic_task [req-59d3618d-87d2-4dd1-86fa-13787e84964b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:47:11.075 143779 DEBUG oslo_concurrency.lockutils [req-de12432f-12de-4232-ba90-6724cbf3419e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:47:11.075 143779 DEBUG oslo_concurrency.lockutils [req-de12432f-12de-4232-ba90-6724cbf3419e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:47:21.068 143781 DEBUG oslo_service.periodic_task [req-430327e4-0046-441e-add9-068063539988 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:47:21.074 143781 DEBUG oslo_concurrency.lockutils [req-8cba55a0-4196-42c9-bfd6-4ea335d50ecb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:47:21.074 143781 DEBUG oslo_concurrency.lockutils [req-8cba55a0-4196-42c9-bfd6-4ea335d50ecb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:47:23.154 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:47:23.154 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:47:23.154 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:47:23.168 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:47:23.169 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:47:23.169 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:47:23.177 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:47:23.178 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:47:23.178 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:47:23.207 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:47:23.207 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:47:23.207 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:47:26.083 143787 DEBUG oslo_service.periodic_task [req-98eb7153-586b-406b-8e8c-fdf499097637 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:47:26.087 143787 DEBUG oslo_concurrency.lockutils [req-ecdb1ee3-d3d3-4ec8-bb4f-48cf0ed6af9a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:47:26.087 143787 DEBUG oslo_concurrency.lockutils [req-ecdb1ee3-d3d3-4ec8-bb4f-48cf0ed6af9a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:47:40.045 143780 DEBUG oslo_service.periodic_task [req-a4a25a2e-2850-48a8-99fc-ec2eeb48cd59 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:47:40.051 143780 DEBUG oslo_concurrency.lockutils [req-7925147c-19c1-433e-bd63-2e9b80357c58 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:47:40.051 143780 DEBUG oslo_concurrency.lockutils [req-7925147c-19c1-433e-bd63-2e9b80357c58 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:47:42.057 143779 DEBUG oslo_service.periodic_task [req-de12432f-12de-4232-ba90-6724cbf3419e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:47:42.074 143779 DEBUG oslo_concurrency.lockutils [req-e2c32332-b358-4837-b725-2a6db1e0335f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:47:42.075 143779 DEBUG oslo_concurrency.lockutils [req-e2c32332-b358-4837-b725-2a6db1e0335f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:47:51.086 143781 DEBUG oslo_service.periodic_task [req-8cba55a0-4196-42c9-bfd6-4ea335d50ecb - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:47:51.091 143781 DEBUG oslo_concurrency.lockutils [req-d28544d6-4fef-40f0-983e-3d49d7dfa86b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:47:51.091 143781 DEBUG oslo_concurrency.lockutils [req-d28544d6-4fef-40f0-983e-3d49d7dfa86b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:47:56.093 143787 DEBUG oslo_service.periodic_task [req-ecdb1ee3-d3d3-4ec8-bb4f-48cf0ed6af9a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:47:56.097 143787 DEBUG oslo_concurrency.lockutils [req-1481ab0b-f161-4983-ba13-2780378253bd - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:47:56.098 143787 DEBUG oslo_concurrency.lockutils [req-1481ab0b-f161-4983-ba13-2780378253bd - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:48:10.069 143780 DEBUG oslo_service.periodic_task [req-7925147c-19c1-433e-bd63-2e9b80357c58 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:48:10.074 143780 DEBUG oslo_concurrency.lockutils [req-990df953-c3d9-476b-add9-89be9bda4ef2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:48:10.075 143780 DEBUG oslo_concurrency.lockutils [req-990df953-c3d9-476b-add9-89be9bda4ef2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:48:12.084 143779 DEBUG oslo_service.periodic_task [req-e2c32332-b358-4837-b725-2a6db1e0335f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:48:12.089 143779 DEBUG oslo_concurrency.lockutils [req-bc1f373e-d15f-433c-aae5-b2bc945f6719 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:48:12.089 143779 DEBUG oslo_concurrency.lockutils [req-bc1f373e-d15f-433c-aae5-b2bc945f6719 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:48:16.346 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c39bf32ed49e4703a4f502c489c766c8 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:48:16.346 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c39bf32ed49e4703a4f502c489c766c8 __call__
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:48:16.346 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c39bf32ed49e4703a4f502c489c766c8 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:48:16.346 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c39bf32ed49e4703a4f502c489c766c8 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:48:16.346 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:48:16.346 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c39bf32ed49e4703a4f502c489c766c8 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:48:16.346 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:48:16.346 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:48:16.346 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c39bf32ed49e4703a4f502c489c766c8 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:48:16.346 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:48:16.346 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:48:16.346 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c39bf32ed49e4703a4f502c489c766c8 
poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:48:16.346 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:48:16.346 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c39bf32ed49e4703a4f502c489c766c8 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:48:16.347 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:48:16.347 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:48:16.347 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:48:16.347 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:48:16.347 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:48:16.347 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:48:16.347 143779 DEBUG oslo_concurrency.lockutils [req-9f2a3628-6426-4c70-ade8-1d9c8d257db9 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:48:16.347 143779 DEBUG nova.scheduler.host_manager [req-9f2a3628-6426-4c70-ade8-1d9c8d257db9 - - - - -] 
Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:48:16.347 143779 DEBUG oslo_concurrency.lockutils [req-9f2a3628-6426-4c70-ade8-1d9c8d257db9 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:48:16.347 143781 DEBUG oslo_concurrency.lockutils [req-9f2a3628-6426-4c70-ade8-1d9c8d257db9 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:48:16.347 143780 DEBUG oslo_concurrency.lockutils [req-9f2a3628-6426-4c70-ade8-1d9c8d257db9 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:48:16.348 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:48:16.348 143787 DEBUG oslo_concurrency.lockutils [req-9f2a3628-6426-4c70-ade8-1d9c8d257db9 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:48:16.348 143781 DEBUG nova.scheduler.host_manager [req-9f2a3628-6426-4c70-ade8-1d9c8d257db9 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:48:16.348 143780 DEBUG nova.scheduler.host_manager [req-9f2a3628-6426-4c70-ade8-1d9c8d257db9 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:48:16.348 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:48:16.348 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:48:16.348 143781 DEBUG oslo_concurrency.lockutils [req-9f2a3628-6426-4c70-ade8-1d9c8d257db9 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:48:16.348 143787 DEBUG nova.scheduler.host_manager [req-9f2a3628-6426-4c70-ade8-1d9c8d257db9 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:48:16.348 143780 DEBUG oslo_concurrency.lockutils [req-9f2a3628-6426-4c70-ade8-1d9c8d257db9 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:48:16.348 143787 DEBUG oslo_concurrency.lockutils [req-9f2a3628-6426-4c70-ade8-1d9c8d257db9 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:48:16.348 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:48:16.348 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:48:16.348 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:48:16.349 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:48:16.349 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:48:16.349 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:48:16.349 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:48:16.349 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:48:16.349 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:48:17.349 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:48:17.350 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:48:17.350 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:48:17.350 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:48:17.350 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:48:17.350 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:48:17.350 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:48:17.350 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:48:17.350 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:48:17.350 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:48:17.351 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:48:17.351 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:48:19.352 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:48:19.353 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:48:19.352 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:48:19.353 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:48:19.353 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:48:19.353 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:48:19.353 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:48:19.353 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:48:19.353 143780 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:48:19.353 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:48:19.353 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:48:19.354 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:48:21.098 143781 DEBUG oslo_service.periodic_task [req-d28544d6-4fef-40f0-983e-3d49d7dfa86b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:48:21.102 143781 DEBUG oslo_concurrency.lockutils [req-5f05175e-5178-4899-866f-4a2658382780 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:48:21.102 143781 DEBUG oslo_concurrency.lockutils [req-5f05175e-5178-4899-866f-4a2658382780 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:48:23.354 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:48:23.354 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 
2026-04-02 01:48:23.354 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:48:23.354 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:48:23.354 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:48:23.355 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:48:23.355 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:48:23.355 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:48:23.355 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:48:23.355 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:48:23.355 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:48:23.355 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:48:27.083 143787 DEBUG oslo_service.periodic_task [req-1481ab0b-f161-4983-ba13-2780378253bd - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:48:27.087 143787 DEBUG oslo_concurrency.lockutils [req-06c18724-ebb1-449d-8040-18dd891a62c5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:48:27.088 143787 DEBUG oslo_concurrency.lockutils [req-06c18724-ebb1-449d-8040-18dd891a62c5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:48:31.357 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:48:31.357 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:48:31.357 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:48:31.358 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:48:31.359 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:48:31.359 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:48:31.360 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:48:31.360 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:48:31.361 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:48:31.362 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:48:31.362 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:48:31.362 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:48:41.045 143780 DEBUG oslo_service.periodic_task [req-990df953-c3d9-476b-add9-89be9bda4ef2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:48:41.049 143780 DEBUG oslo_concurrency.lockutils [req-c3c00859-572c-4b67-a522-7b02ce0cc4b0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:48:41.050 143780 DEBUG oslo_concurrency.lockutils [req-c3c00859-572c-4b67-a522-7b02ce0cc4b0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:48:43.057 143779 DEBUG oslo_service.periodic_task [req-bc1f373e-d15f-433c-aae5-b2bc945f6719 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:48:43.061 143779 DEBUG oslo_concurrency.lockutils [req-bbfed837-96b4-475e-a23f-a7700a8f6db0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:48:43.062 143779 DEBUG oslo_concurrency.lockutils [req-bbfed837-96b4-475e-a23f-a7700a8f6db0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:48:47.359 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:48:47.359 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:48:47.359 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:48:47.361 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:48:47.361 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:48:47.361 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:48:47.362 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:48:47.363 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:48:47.363 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:48:47.363 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:48:47.364 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:48:47.364 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:48:51.109 143781 DEBUG oslo_service.periodic_task [req-5f05175e-5178-4899-866f-4a2658382780 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:48:51.114 143781 DEBUG oslo_concurrency.lockutils [req-42814116-8be0-4321-8d94-55e977ff7ddf - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:48:51.114 143781 DEBUG oslo_concurrency.lockutils [req-42814116-8be0-4321-8d94-55e977ff7ddf - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:48:57.094 143787 DEBUG oslo_service.periodic_task [req-06c18724-ebb1-449d-8040-18dd891a62c5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:48:57.098 143787 DEBUG oslo_concurrency.lockutils [req-b05b3df2-5126-4b63-8bb2-65dbd07c9b44 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:48:57.099 143787 DEBUG oslo_concurrency.lockutils [req-b05b3df2-5126-4b63-8bb2-65dbd07c9b44 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:49:11.057 143780 DEBUG oslo_service.periodic_task [req-c3c00859-572c-4b67-a522-7b02ce0cc4b0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:49:11.061 143780 DEBUG oslo_concurrency.lockutils [req-d9565807-af99-4ec6-9904-8a482c54a305 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:49:11.061 143780 DEBUG oslo_concurrency.lockutils [req-d9565807-af99-4ec6-9904-8a482c54a305 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:49:13.070 143779 DEBUG oslo_service.periodic_task [req-bbfed837-96b4-475e-a23f-a7700a8f6db0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:49:13.074 143779 DEBUG oslo_concurrency.lockutils [req-6aaf557f-7040-48d5-9771-6f9b53326f6c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:49:13.075 143779 DEBUG oslo_concurrency.lockutils [req-6aaf557f-7040-48d5-9771-6f9b53326f6c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:49:21.122 143781 DEBUG oslo_service.periodic_task [req-42814116-8be0-4321-8d94-55e977ff7ddf - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:49:21.126 143781 DEBUG oslo_concurrency.lockutils [req-cf00683a-6fe5-4bfd-b34a-9c59836e5e47 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:49:21.126 143781 DEBUG oslo_concurrency.lockutils [req-cf00683a-6fe5-4bfd-b34a-9c59836e5e47 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:49:23.158 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:49:23.158 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:49:23.158 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:49:23.165 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:49:23.166 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:49:23.166 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:49:23.179 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:49:23.180 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:49:23.180 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:49:23.207 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:49:23.207 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:49:23.207 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:49:28.084 143787 DEBUG oslo_service.periodic_task [req-b05b3df2-5126-4b63-8bb2-65dbd07c9b44 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:49:28.088 143787 DEBUG oslo_concurrency.lockutils [req-5f02cd85-ce3c-4cf0-bec9-a8b1f3aa095b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:49:28.088 143787 DEBUG oslo_concurrency.lockutils [req-5f02cd85-ce3c-4cf0-bec9-a8b1f3aa095b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:49:42.044 143780 DEBUG oslo_service.periodic_task [req-d9565807-af99-4ec6-9904-8a482c54a305 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:49:42.048 143780 DEBUG oslo_concurrency.lockutils [req-8f35d2a7-8acc-4106-ae55-45d0046190ce - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:49:42.048 143780 DEBUG oslo_concurrency.lockutils [req-8f35d2a7-8acc-4106-ae55-45d0046190ce - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:49:44.057 143779 DEBUG oslo_service.periodic_task [req-6aaf557f-7040-48d5-9771-6f9b53326f6c - - - - -] Running periodic task
SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:49:44.062 143779 DEBUG oslo_concurrency.lockutils [req-06ad4e2e-9cf6-44da-ac8d-d2b370bee555 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:49:44.062 143779 DEBUG oslo_concurrency.lockutils [req-06ad4e2e-9cf6-44da-ac8d-d2b370bee555 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:49:51.133 143781 DEBUG oslo_service.periodic_task [req-cf00683a-6fe5-4bfd-b34a-9c59836e5e47 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:49:51.138 143781 DEBUG oslo_concurrency.lockutils [req-374cbb37-5c2b-476b-9945-5b57c9359546 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:49:51.138 143781 DEBUG oslo_concurrency.lockutils [req-374cbb37-5c2b-476b-9945-5b57c9359546 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:49:59.083 143787 DEBUG oslo_service.periodic_task [req-5f02cd85-ce3c-4cf0-bec9-a8b1f3aa095b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 
2026-04-02 01:49:59.087 143787 DEBUG oslo_concurrency.lockutils [req-9d38a615-09a2-44af-9073-ad916a8d4815 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:49:59.088 143787 DEBUG oslo_concurrency.lockutils [req-9d38a615-09a2-44af-9073-ad916a8d4815 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:50:12.055 143780 DEBUG oslo_service.periodic_task [req-8f35d2a7-8acc-4106-ae55-45d0046190ce - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:50:12.059 143780 DEBUG oslo_concurrency.lockutils [req-cdc08e94-f62c-41db-b968-42e7f4baec1c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:50:12.059 143780 DEBUG oslo_concurrency.lockutils [req-cdc08e94-f62c-41db-b968-42e7f4baec1c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:50:14.073 143779 DEBUG oslo_service.periodic_task [req-06ad4e2e-9cf6-44da-ac8d-d2b370bee555 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:50:14.077 143779 DEBUG oslo_concurrency.lockutils [req-f48f4ced-3796-403b-aaba-44947fbb8e1a - - - - -] Lock 
"93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:50:14.077 143779 DEBUG oslo_concurrency.lockutils [req-f48f4ced-3796-403b-aaba-44947fbb8e1a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:50:15.741 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1301d4bc2f8143779e58a4b5e29e9527 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:50:15.741 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1301d4bc2f8143779e58a4b5e29e9527 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:50:15.741 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1301d4bc2f8143779e58a4b5e29e9527 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:50:15.741 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1301d4bc2f8143779e58a4b5e29e9527 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:50:15.741 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:15.741 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:15.741 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:15.741 143787 
DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:15.741 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1301d4bc2f8143779e58a4b5e29e9527 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:50:15.741 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1301d4bc2f8143779e58a4b5e29e9527 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:50:15.741 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1301d4bc2f8143779e58a4b5e29e9527 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:50:15.741 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1301d4bc2f8143779e58a4b5e29e9527 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:50:15.741 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:15.741 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:15.741 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:15.741 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:50:15.741 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:15.741 143779 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:50:15.741 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:50:15.741 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:50:15.742 143781 DEBUG oslo_concurrency.lockutils [req-f2c0410e-7f42-496f-8c25-42ec3684a3ff - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:50:15.742 143780 DEBUG oslo_concurrency.lockutils [req-f2c0410e-7f42-496f-8c25-42ec3684a3ff - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:50:15.742 143779 DEBUG oslo_concurrency.lockutils [req-f2c0410e-7f42-496f-8c25-42ec3684a3ff - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:50:15.742 143787 DEBUG oslo_concurrency.lockutils [req-f2c0410e-7f42-496f-8c25-42ec3684a3ff - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:50:15.742 143781 DEBUG nova.scheduler.host_manager [req-f2c0410e-7f42-496f-8c25-42ec3684a3ff - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:50:15.742 143780 DEBUG nova.scheduler.host_manager [req-f2c0410e-7f42-496f-8c25-42ec3684a3ff - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:50:15.742 143787 DEBUG nova.scheduler.host_manager [req-f2c0410e-7f42-496f-8c25-42ec3684a3ff - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:50:15.742 143779 DEBUG nova.scheduler.host_manager [req-f2c0410e-7f42-496f-8c25-42ec3684a3ff - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:50:15.742 143781 DEBUG oslo_concurrency.lockutils [req-f2c0410e-7f42-496f-8c25-42ec3684a3ff - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:50:15.742 143780 DEBUG oslo_concurrency.lockutils [req-f2c0410e-7f42-496f-8c25-42ec3684a3ff - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:50:15.743 143779 DEBUG oslo_concurrency.lockutils [req-f2c0410e-7f42-496f-8c25-42ec3684a3ff - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:50:15.743 143787 DEBUG oslo_concurrency.lockutils [req-f2c0410e-7f42-496f-8c25-42ec3684a3ff - - - - -] Lock "host_instance" "released" by 
"nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:50:15.743 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:50:15.743 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:50:15.743 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:15.743 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:15.743 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:50:15.743 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:50:15.744 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:50:15.744 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:15.744 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:50:15.744 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:50:15.744 
143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:15.744 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:50:16.744 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:50:16.744 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:50:16.744 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:16.744 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:16.745 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:50:16.745 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:50:16.745 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:50:16.745 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:50:16.746 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:16.746 
143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:50:16.746 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:16.746 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:50:18.747 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:50:18.747 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:18.747 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:50:18.747 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:50:18.747 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:18.747 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:50:18.748 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:50:18.748 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 
01:50:18.748 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:18.748 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:18.749 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:50:18.749 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:50:21.145 143781 DEBUG oslo_service.periodic_task [req-374cbb37-5c2b-476b-9945-5b57c9359546 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:50:21.149 143781 DEBUG oslo_concurrency.lockutils [req-2d02166c-b07c-48c4-8ecc-625e5587ad4f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:50:21.150 143781 DEBUG oslo_concurrency.lockutils [req-2d02166c-b07c-48c4-8ecc-625e5587ad4f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:50:22.749 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:50:22.750 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:22.750 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:50:22.751 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:50:22.751 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:22.751 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:50:22.752 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:50:22.752 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:22.752 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:50:22.753 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:50:22.753 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:22.754 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:50:29.094 143787 DEBUG oslo_service.periodic_task 
[req-9d38a615-09a2-44af-9073-ad916a8d4815 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:50:29.099 143787 DEBUG oslo_concurrency.lockutils [req-18c83c79-679b-48a8-ac33-b918ed46d8b4 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:50:29.099 143787 DEBUG oslo_concurrency.lockutils [req-18c83c79-679b-48a8-ac33-b918ed46d8b4 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:50:30.752 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:50:30.753 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:30.753 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:50:30.756 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:50:30.756 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:50:30.756 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:30.756 143781 
DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:50:30.756 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:30.756 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:50:30.760 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:50:30.760 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:50:30.760 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:50:42.071 143780 DEBUG oslo_service.periodic_task [req-cdc08e94-f62c-41db-b968-42e7f4baec1c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:50:42.075 143780 DEBUG oslo_concurrency.lockutils [req-db4d93a0-cea1-4c23-91ff-eba55631ebd5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:50:42.075 143780 DEBUG oslo_concurrency.lockutils [req-db4d93a0-cea1-4c23-91ff-eba55631ebd5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 
2026-04-02 01:50:45.058 143779 DEBUG oslo_service.periodic_task [req-f48f4ced-3796-403b-aaba-44947fbb8e1a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:50:45.063 143779 DEBUG oslo_concurrency.lockutils [req-90f78527-b858-4eb6-9f91-42591a79c241 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:50:45.063 143779 DEBUG oslo_concurrency.lockutils [req-90f78527-b858-4eb6-9f91-42591a79c241 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:50:46.755 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:50:46.755 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:50:46.755 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:50:46.757 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:50:46.757 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:50:46.758 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:50:46.758 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:50:46.758 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:50:46.758 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:50:46.761 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:50:46.761 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:50:46.761 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:50:51.157 143781 DEBUG oslo_service.periodic_task [req-2d02166c-b07c-48c4-8ecc-625e5587ad4f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:50:51.162 143781 DEBUG oslo_concurrency.lockutils [req-13a9bf4f-0367-4b8c-9145-ad13080912aa - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:50:51.162 143781 DEBUG oslo_concurrency.lockutils [req-13a9bf4f-0367-4b8c-9145-ad13080912aa - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:50:56.178 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 3ef4e73b5d6843579d6d4a1b2adbf9ca reply to reply_54bc7bb5014144dab053290b62b8003d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320
2026-04-02 01:50:56.178 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:50:56.178 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 99642e66117344d18dfc0d9b4e63325a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:50:56.179 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:50:56.179 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:50:56.181 143781 DEBUG nova.scheduler.manager [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['653b3b34-72ca-4b88-8ab1-886ad859d2f3'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141
2026-04-02 01:50:56.183 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:50:56.183 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:50:56.184 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:50:56.189 143781 DEBUG nova.scheduler.request_filter [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:50:56.189 143781 DEBUG nova.scheduler.request_filter [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254
2026-04-02 01:50:56.189 143781 DEBUG nova.scheduler.request_filter [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:50:56.189 143781 DEBUG nova.scheduler.request_filter [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:50:56.190 143781 DEBUG nova.scheduler.request_filter [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:50:56.194 143781 DEBUG oslo_concurrency.lockutils [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:50:56.194 143781 DEBUG oslo_concurrency.lockutils [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:50:56.648 143781 DEBUG oslo_messaging._drivers.amqpdriver [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: b249fb7f6b1e4169bfe7be8a460e17e8 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656
2026-04-02 01:50:56.652 143781 DEBUG oslo_concurrency.lockutils [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:50:56.653 143781 DEBUG oslo_concurrency.lockutils [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:50:56.664 143781 DEBUG oslo_concurrency.lockutils [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:50:56.665 143781 DEBUG nova.scheduler.host_manager [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", "tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", "mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", "memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:50:27Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167
2026-04-02 01:50:56.666 143781 DEBUG nova.scheduler.host_manager [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170
2026-04-02 01:50:56.667 143781 DEBUG nova.scheduler.host_manager [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 430, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 2, 1, 50, 51, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 50, 51, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173
2026-04-02 01:50:56.667 143781 DEBUG nova.scheduler.host_manager [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176
2026-04-02 01:50:56.667 143781 DEBUG oslo_concurrency.lockutils [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.003s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:50:56.667 143781 INFO nova.scheduler.host_manager [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1
2026-04-02 01:50:56.667 143781 DEBUG nova.scheduler.manager [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610
2026-04-02 01:50:56.668 143781 DEBUG nova.scheduler.manager [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632
2026-04-02 01:50:56.668 143781 DEBUG nova.scheduler.utils [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance 653b3b34-72ca-4b88-8ab1-886ad859d2f3 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253
2026-04-02 01:50:56.762 143781 DEBUG nova.scheduler.manager [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: 653b3b34-72ca-4b88-8ab1-886ad859d2f3] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513
2026-04-02 01:50:56.762 143781 DEBUG oslo_concurrency.lockutils [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:50:56.763 143781 DEBUG oslo_concurrency.lockutils [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:50:56.764 143781 DEBUG oslo_messaging._drivers.amqpdriver [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: b46ea332aa444e719cd255801d4470f3 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656
2026-04-02 01:50:56.769 143781 DEBUG oslo_messaging._drivers.amqpdriver [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: 3ef4e73b5d6843579d6d4a1b2adbf9ca reply queue: reply_54bc7bb5014144dab053290b62b8003d time elapsed: 0.5911187099991366s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118
2026-04-02 01:50:57.184 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:50:57.185 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:50:57.185 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:50:59.106 143787 DEBUG oslo_service.periodic_task [req-18c83c79-679b-48a8-ac33-b918ed46d8b4 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:50:59.112 143787 DEBUG oslo_concurrency.lockutils [req-573d3f9f-0a91-4bab-b73c-fcb839b1d9ea - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:50:59.112 143787 DEBUG oslo_concurrency.lockutils [req-573d3f9f-0a91-4bab-b73c-fcb839b1d9ea - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:50:59.187 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:50:59.188 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:50:59.188 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:03.190 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:03.191 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:03.191 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:04.376 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 6fea4e54da95496cac67b303ef1b9fac __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:51:04.376 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 6fea4e54da95496cac67b303ef1b9fac __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:51:04.376 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 6fea4e54da95496cac67b303ef1b9fac __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:51:04.376 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:04.376 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:04.376 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:04.376 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 6fea4e54da95496cac67b303ef1b9fac poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:51:04.376 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 6fea4e54da95496cac67b303ef1b9fac poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:51:04.376 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 6fea4e54da95496cac67b303ef1b9fac poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:51:04.376 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:04.376 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:04.376 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:04.376 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:04.376 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:04.377 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:04.377 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 6fea4e54da95496cac67b303ef1b9fac __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:51:04.378 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:04.379 143780 DEBUG oslo_concurrency.lockutils [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:51:04.379 143779 DEBUG oslo_concurrency.lockutils [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:51:04.379 143780 DEBUG oslo_concurrency.lockutils [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:51:04.379 143779 DEBUG oslo_concurrency.lockutils [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:51:04.380 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:04.380 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:04.380 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:04.380 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:04.380 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:04.380 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:04.380 143787 DEBUG oslo_concurrency.lockutils [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:51:04.381 143787 DEBUG oslo_concurrency.lockutils [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:51:04.378 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 6fea4e54da95496cac67b303ef1b9fac poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:51:04.381 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:04.381 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:04.381 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:04.381 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:04.382 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:04.384 143781 DEBUG oslo_concurrency.lockutils [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:51:04.384 143781 DEBUG oslo_concurrency.lockutils [req-3fb328ce-07d3-48be-94e8-ed5991d39957 bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:51:04.384 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:04.384 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:04.384 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:05.380 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:05.381 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:05.381 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:05.381 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:05.382 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:05.382 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:05.382 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:05.383 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:05.383 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:05.385 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:05.386 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:05.386 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:07.383 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:07.383 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:07.383 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:07.384 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:07.384 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:07.384 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:07.384 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:07.385 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:07.385 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:07.388 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:07.388 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:07.388 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:11.191 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 6375a188fcbf4632a215768e3e7c5377 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:51:11.191 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 6375a188fcbf4632a215768e3e7c5377 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:51:11.191 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 6375a188fcbf4632a215768e3e7c5377 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:51:11.191 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 6375a188fcbf4632a215768e3e7c5377 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:51:11.191 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:11.191 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:11.191 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:11.191 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 6375a188fcbf4632a215768e3e7c5377 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:51:11.191 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:11.191 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 6375a188fcbf4632a215768e3e7c5377 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:51:11.191 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 6375a188fcbf4632a215768e3e7c5377 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:51:11.191 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:11.191 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 6375a188fcbf4632a215768e3e7c5377 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:51:11.192 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:11.192 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:11.192 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:11.192 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:11.192 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:11.192 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:11.192 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:11.192 143781 DEBUG oslo_concurrency.lockutils [req-2f7f565a-4353-4ab2-a365-948fabde404e bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock
"host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:51:11.192 143780 DEBUG oslo_concurrency.lockutils [req-2f7f565a-4353-4ab2-a365-948fabde404e bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:51:11.192 143779 DEBUG oslo_concurrency.lockutils [req-2f7f565a-4353-4ab2-a365-948fabde404e bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:51:11.192 143781 DEBUG oslo_concurrency.lockutils [req-2f7f565a-4353-4ab2-a365-948fabde404e bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:51:11.192 143780 DEBUG oslo_concurrency.lockutils [req-2f7f565a-4353-4ab2-a365-948fabde404e bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:51:11.192 143787 DEBUG oslo_concurrency.lockutils [req-2f7f565a-4353-4ab2-a365-948fabde404e 
bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:51:11.193 143779 DEBUG oslo_concurrency.lockutils [req-2f7f565a-4353-4ab2-a365-948fabde404e bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:51:11.193 143787 DEBUG oslo_concurrency.lockutils [req-2f7f565a-4353-4ab2-a365-948fabde404e bcbef5184772432685db16ae384a6e53 86132cccd0e24941a2a7595b00e45aa2 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:51:11.194 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:11.194 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:11.194 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:11.194 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:11.194 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:11.194 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:11.194 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:11.194 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:11.194 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:11.194 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:11.194 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:11.194 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:12.080 143780 DEBUG oslo_service.periodic_task [req-db4d93a0-cea1-4c23-91ff-eba55631ebd5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:51:12.085 143780 DEBUG oslo_concurrency.lockutils [req-25fa1d7b-3b17-4096-bf55-9a462ab7f4bf - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:51:12.085 143780 
DEBUG oslo_concurrency.lockutils [req-25fa1d7b-3b17-4096-bf55-9a462ab7f4bf - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:51:12.195 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:12.195 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:12.195 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:12.196 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:12.196 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:12.196 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:12.196 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:12.196 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:12.196 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:12.196 143781 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:12.196 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:12.196 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:14.197 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:14.198 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:14.198 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:14.198 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:14.198 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:14.198 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:14.198 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:14.198 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:14.198 143781 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:14.198 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:14.198 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:14.199 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:15.069 143779 DEBUG oslo_service.periodic_task [req-90f78527-b858-4eb6-9f91-42591a79c241 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:51:15.073 143779 DEBUG oslo_concurrency.lockutils [req-2dc8e14d-5f1f-4bf4-9468-defaedf56190 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:51:15.073 143779 DEBUG oslo_concurrency.lockutils [req-2dc8e14d-5f1f-4bf4-9468-defaedf56190 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:51:18.199 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:18.200 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 
2026-04-02 01:51:18.200 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:18.200 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:18.200 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:18.200 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:18.201 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:18.201 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:18.201 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:18.201 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:18.201 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:18.201 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:21.169 143781 DEBUG oslo_service.periodic_task [req-13a9bf4f-0367-4b8c-9145-ad13080912aa - - - - -] Running periodic task 
SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:51:21.173 143781 DEBUG oslo_concurrency.lockutils [req-222f968e-44a9-4878-ac48-60ebfe9a7f58 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:51:21.173 143781 DEBUG oslo_concurrency.lockutils [req-222f968e-44a9-4878-ac48-60ebfe9a7f58 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:51:26.202 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:26.203 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:26.203 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:26.203 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:26.203 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:26.203 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:26.206 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout 
poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:26.207 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:26.207 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:26.208 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:26.208 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:26.209 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:29.119 143787 DEBUG oslo_service.periodic_task [req-573d3f9f-0a91-4bab-b73c-fcb839b1d9ea - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:51:29.123 143787 DEBUG oslo_concurrency.lockutils [req-3bb273ae-decf-44a7-b905-83fdff74ee02 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:51:29.123 143787 DEBUG oslo_concurrency.lockutils [req-3bb273ae-decf-44a7-b905-83fdff74ee02 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:51:29.950 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] 
received message msg_id: 31099e5d5a914b48bb3a7da1bf60f851 reply to reply_54bc7bb5014144dab053290b62b8003d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-02 01:51:29.951 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:29.951 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: efffc2d9d12b4846b5bad4af3e4c4bf3 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:51:29.951 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:29.951 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:29.953 143780 DEBUG nova.scheduler.manager [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['85d87eda-9a86-429a-8ff8-09da00074e45'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-02 01:51:29.955 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:29.955 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:29.955 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:29.962 143780 DEBUG nova.scheduler.request_filter 
[req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:51:29.963 143780 DEBUG nova.scheduler.request_filter [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-02 01:51:29.963 143780 DEBUG nova.scheduler.request_filter [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:51:29.963 143780 DEBUG nova.scheduler.request_filter [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:51:29.964 143780 DEBUG nova.scheduler.request_filter [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:51:29.968 143780 DEBUG oslo_concurrency.lockutils [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 
11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:51:29.968 143780 DEBUG oslo_concurrency.lockutils [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:51:30.398 143780 DEBUG oslo_messaging._drivers.amqpdriver [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: ef4f02d1c2dd444f926001cce05b1425 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:51:30.402 143780 DEBUG oslo_concurrency.lockutils [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:51:30.403 143780 DEBUG oslo_concurrency.lockutils [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:51:30.417 143780 DEBUG oslo_concurrency.lockutils [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:51:30.418 143780 DEBUG nova.scheduler.host_manager [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", "tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", 
"mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", 
"memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:51:28Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-02 01:51:30.420 143780 DEBUG nova.scheduler.host_manager [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-02 01:51:30.422 143780 DEBUG nova.scheduler.host_manager [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 433, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 2, 1, 51, 21, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 51, 21, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-02 01:51:30.422 143780 DEBUG nova.scheduler.host_manager [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 
11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-02 01:51:30.422 143780 DEBUG oslo_concurrency.lockutils [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.005s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:51:30.422 143780 INFO nova.scheduler.host_manager [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1 2026-04-02 01:51:30.423 143780 DEBUG nova.scheduler.manager [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-02 01:51:30.423 143780 DEBUG nova.scheduler.manager [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts 
/usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-02 01:51:30.423 143780 DEBUG nova.scheduler.utils [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance 85d87eda-9a86-429a-8ff8-09da00074e45 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-02 01:51:30.539 143780 DEBUG nova.scheduler.manager [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: 85d87eda-9a86-429a-8ff8-09da00074e45] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-02 01:51:30.540 143780 DEBUG oslo_concurrency.lockutils [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:51:30.541 143780 DEBUG oslo_concurrency.lockutils [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:51:30.542 143780 DEBUG oslo_messaging._drivers.amqpdriver [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: c33e86e055c54c71b57324c961d44ab4 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:51:30.548 143780 DEBUG oslo_messaging._drivers.amqpdriver [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: 31099e5d5a914b48bb3a7da1bf60f851 reply queue: reply_54bc7bb5014144dab053290b62b8003d time elapsed: 0.5972318960002667s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-02 01:51:30.957 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:30.957 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:30.958 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:32.093 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: b7dfe37c267d43f9ace3cb4decb7376d reply to reply_02dbf4ca539d4c419255761ef99731cf __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-02 01:51:32.093 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:32.093 143779 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 017c621b3c374ccd8fce0a20fd0bc675 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:51:32.094 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:32.094 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:32.095 143779 DEBUG nova.scheduler.manager [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['ecd3bba1-64a2-4787-a9b5-90c7a2bb922a'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-02 01:51:32.098 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:32.098 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:32.098 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:32.107 143779 DEBUG nova.scheduler.request_filter [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:51:32.107 143779 DEBUG nova.scheduler.request_filter [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 
11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-02 01:51:32.108 143779 DEBUG nova.scheduler.request_filter [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:51:32.108 143779 DEBUG nova.scheduler.request_filter [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:51:32.109 143779 DEBUG nova.scheduler.request_filter [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:51:32.117 143779 DEBUG oslo_concurrency.lockutils [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:51:32.117 143779 DEBUG oslo_concurrency.lockutils 
[req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:51:32.510 143779 DEBUG oslo_messaging._drivers.amqpdriver [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 018898ecc3834fd5a26365bf21ba3ea4 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:51:32.516 143779 DEBUG oslo_concurrency.lockutils [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:51:32.516 143779 DEBUG oslo_concurrency.lockutils [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:51:32.530 143779 DEBUG oslo_concurrency.lockutils [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock 
"('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:51:32.530 143779 DEBUG nova.scheduler.host_manager [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", "tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", 
"mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=76,free_ram_mb=30363,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=1,mapped=1,memory_mb=31899,memory_mb_used=1536,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", 
"memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=1,service_id=None,stats={failed_builds='0',io_workload='1',num_instances='1',num_os_type_None='1',num_proj_562ef31119d44e399fd3c90ec6589724='1',num_task_None='1',num_vm_building='1'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:51:31Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=1) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-02 01:51:32.532 143779 DEBUG nova.scheduler.host_manager [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-02 01:51:32.532 143779 DEBUG nova.scheduler.host_manager [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 434, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 2, 1, 51, 31, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 51, 31, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update 
/usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-02 01:51:32.532 143779 DEBUG nova.scheduler.host_manager [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-02 01:51:32.532 143779 DEBUG oslo_concurrency.lockutils [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.003s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:51:32.533 143779 INFO nova.scheduler.host_manager [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1 2026-04-02 01:51:32.533 143779 DEBUG nova.scheduler.manager [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 26624MB io_ops: 1 instances: 1]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-02 01:51:32.533 143779 DEBUG nova.scheduler.manager [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: 
(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 26624MB io_ops: 1 instances: 1, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-02 01:51:32.534 143779 DEBUG nova.scheduler.utils [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance ecd3bba1-64a2-4787-a9b5-90c7a2bb922a claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-02 01:51:32.608 143779 DEBUG nova.scheduler.manager [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: ecd3bba1-64a2-4787-a9b5-90c7a2bb922a] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 26624MB io_ops: 1 instances: 1 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-02 01:51:32.609 143779 DEBUG oslo_concurrency.lockutils [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:51:32.609 143779 DEBUG oslo_concurrency.lockutils [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 
'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:51:32.611 143779 DEBUG oslo_messaging._drivers.amqpdriver [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 286bcb320ade4fceaf0fe0a926efe589 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:51:32.617 143779 DEBUG oslo_messaging._drivers.amqpdriver [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: b7dfe37c267d43f9ace3cb4decb7376d reply queue: reply_02dbf4ca539d4c419255761ef99731cf time elapsed: 0.5233739080003943s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-02 01:51:32.960 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:32.960 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:32.961 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:33.099 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:33.099 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:33.100 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:33.919 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7163261e2544447899f453707e871e35 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:51:33.919 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7163261e2544447899f453707e871e35 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:51:33.919 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7163261e2544447899f453707e871e35 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:51:33.919 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:33.919 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:33.919 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:33.919 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 7163261e2544447899f453707e871e35 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:51:33.919 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 7163261e2544447899f453707e871e35 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:51:33.919 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the 
incoming message with unique_id: 7163261e2544447899f453707e871e35 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:51:33.919 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:33.919 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:33.919 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:33.919 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:33.919 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:33.920 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:33.921 143787 DEBUG oslo_concurrency.lockutils [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:51:33.921 143781 DEBUG oslo_concurrency.lockutils [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 
0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:51:33.922 143787 DEBUG oslo_concurrency.lockutils [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:51:33.922 143781 DEBUG oslo_concurrency.lockutils [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:51:33.922 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:33.922 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:33.922 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 7163261e2544447899f453707e871e35 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:51:33.922 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:33.922 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:33.922 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:33.922 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:33.922 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:33.922 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 7163261e2544447899f453707e871e35 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:51:33.923 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:33.923 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:33.927 143779 DEBUG oslo_concurrency.lockutils [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:51:33.927 143780 DEBUG oslo_concurrency.lockutils [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:51:33.928 143780 DEBUG oslo_concurrency.lockutils [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:51:33.928 143779 DEBUG oslo_concurrency.lockutils [req-c1ea06bd-7efd-4c20-b961-88aa9442eb47 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:51:33.930 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:33.930 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:33.930 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:33.931 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:33.931 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:33.932 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:34.923 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:34.923 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:34.923 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:34.923 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:34.923 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:34.924 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:34.931 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:34.931 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:34.931 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:34.933 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:34.934 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:34.934 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:35.843 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 04109694a7f343578ce6b21e29ddb49b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:51:35.843 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 04109694a7f343578ce6b21e29ddb49b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:51:35.844 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:35.844 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 04109694a7f343578ce6b21e29ddb49b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:51:35.844 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 04109694a7f343578ce6b21e29ddb49b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:51:35.844 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 04109694a7f343578ce6b21e29ddb49b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:51:35.844 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:35.844 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 04109694a7f343578ce6b21e29ddb49b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:51:35.844 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:35.844 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:35.844 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:35.844 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 04109694a7f343578ce6b21e29ddb49b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:51:35.844 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:35.844 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:35.844 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:35.844 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:35.845 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:35.846 143787 DEBUG oslo_concurrency.lockutils [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:51:35.846 143779 DEBUG oslo_concurrency.lockutils [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:51:35.846 143779 DEBUG oslo_concurrency.lockutils [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:51:35.846 143787 DEBUG oslo_concurrency.lockutils [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:51:35.847 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:35.847 143781 DEBUG oslo_concurrency.lockutils [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:51:35.847 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:35.847 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:35.847 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:35.847 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:35.847 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:35.847 143781 DEBUG oslo_concurrency.lockutils [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:51:35.847 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:35.847 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:35.848 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:35.848 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 04109694a7f343578ce6b21e29ddb49b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:51:35.848 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:35.848 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:35.850 143780 DEBUG oslo_concurrency.lockutils [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:51:35.850 143780 DEBUG oslo_concurrency.lockutils [req-d03b9cad-8ff1-4dc5-982c-c54b7394f2e2 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:51:35.851 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:35.851 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:35.851 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:36.847 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:36.848 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:36.848 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:36.848 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:36.848 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:36.848 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:36.848 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:36.849 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:36.849 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:36.852 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:36.853 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:36.853 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:38.849 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:38.850 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:38.850 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:38.850 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:38.851 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:38.851 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:38.851 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:38.852 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:38.852 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:38.855 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:38.856 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:38.856 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:42.090 143780 DEBUG oslo_service.periodic_task [req-25fa1d7b-3b17-4096-bf55-9a462ab7f4bf - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:51:42.094 143780 DEBUG oslo_concurrency.lockutils [req-aea1951c-bcc7-485a-9a03-5bb79901bd8b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:51:42.094 143780 DEBUG oslo_concurrency.lockutils [req-aea1951c-bcc7-485a-9a03-5bb79901bd8b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:51:42.852 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:42.853 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:42.853 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:42.854 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:42.854 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:42.854 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:42.854 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:42.854 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:42.854 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:42.857 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:42.857 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:42.858 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:45.079 143779 DEBUG oslo_service.periodic_task [req-2dc8e14d-5f1f-4bf4-9468-defaedf56190 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:51:45.083 143779 DEBUG oslo_concurrency.lockutils [req-263b7112-41a7-44dc-a57b-403bce4d2b23 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:51:45.083 143779 DEBUG oslo_concurrency.lockutils [req-263b7112-41a7-44dc-a57b-403bce4d2b23 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:51:50.857 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:50.858 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:50.858 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:50.858 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:50.858 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:50.858 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:50.859 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:50.859 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:50.859 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:50.863 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:50.864 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:50.864 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:51.178 143781 DEBUG oslo_service.periodic_task [req-222f968e-44a9-4878-ac48-60ebfe9a7f58 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:51:51.181 143781 DEBUG oslo_concurrency.lockutils [req-f39617ca-d696-4c10-acd8-c39a49e9140a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:51:51.182 143781 DEBUG oslo_concurrency.lockutils [req-f39617ca-d696-4c10-acd8-c39a49e9140a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:51:53.547 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f9bbbdc1dbc848a9a210b311263cb09c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:51:53.547 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f9bbbdc1dbc848a9a210b311263cb09c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:51:53.547 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f9bbbdc1dbc848a9a210b311263cb09c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:51:53.547 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f9bbbdc1dbc848a9a210b311263cb09c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:51:53.547 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:53.547 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:53.547 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:53.547 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:53.548 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f9bbbdc1dbc848a9a210b311263cb09c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:51:53.548 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f9bbbdc1dbc848a9a210b311263cb09c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:51:53.548 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f9bbbdc1dbc848a9a210b311263cb09c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:51:53.548 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f9bbbdc1dbc848a9a210b311263cb09c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:51:53.548 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:53.548 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:53.548 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:53.548 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:53.548 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:53.548 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:53.548 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:53.548 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:53.548 143787 DEBUG oslo_concurrency.lockutils [req-ab74c0c5-01fc-42fc-baa4-fe9d86c13963 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:51:53.549 143779 DEBUG oslo_concurrency.lockutils [req-ab74c0c5-01fc-42fc-baa4-fe9d86c13963 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:51:53.549 143781 DEBUG oslo_concurrency.lockutils [req-ab74c0c5-01fc-42fc-baa4-fe9d86c13963 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:51:53.549 143787 DEBUG oslo_concurrency.lockutils [req-ab74c0c5-01fc-42fc-baa4-fe9d86c13963 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:51:53.549 143779 DEBUG oslo_concurrency.lockutils [req-ab74c0c5-01fc-42fc-baa4-fe9d86c13963 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:51:53.549 143780 DEBUG oslo_concurrency.lockutils [req-ab74c0c5-01fc-42fc-baa4-fe9d86c13963 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:51:53.549 143781 DEBUG oslo_concurrency.lockutils [req-ab74c0c5-01fc-42fc-baa4-fe9d86c13963 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:51:53.549 143780 DEBUG oslo_concurrency.lockutils [req-ab74c0c5-01fc-42fc-baa4-fe9d86c13963 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:51:53.550 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:53.550 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:53.550 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:53.550 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:53.550 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:53.550 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:53.551 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:53.551 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:53.551 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:53.551 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:51:53.552 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:53.552 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:53.605 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ef619685f5e34f6d953758a26d425c6b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:51:53.605 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ef619685f5e34f6d953758a26d425c6b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:51:53.605 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ef619685f5e34f6d953758a26d425c6b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:51:53.605 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ef619685f5e34f6d953758a26d425c6b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:51:53.606 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:53.606 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:53.606 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:53.606 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ef619685f5e34f6d953758a26d425c6b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:51:53.606 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:53.606 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ef619685f5e34f6d953758a26d425c6b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:51:53.606 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ef619685f5e34f6d953758a26d425c6b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:51:53.606 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ef619685f5e34f6d953758a26d425c6b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:51:53.606 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:53.606 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:53.606 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:53.606 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:53.606 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:51:53.606 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:53.606 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:53.606 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:51:53.607 143780 DEBUG oslo_concurrency.lockutils [req-f3324fd4-92ef-488f-99ae-6754d187ee8d 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:51:53.607 143780 DEBUG oslo_concurrency.lockutils [req-f3324fd4-92ef-488f-99ae-6754d187ee8d 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:51:53.607 143787 DEBUG oslo_concurrency.lockutils [req-f3324fd4-92ef-488f-99ae-6754d187ee8d 11fa09a546a147a7b6232a47c6d5db68
562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:51:53.607 143781 DEBUG oslo_concurrency.lockutils [req-f3324fd4-92ef-488f-99ae-6754d187ee8d 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:51:53.607 143787 DEBUG oslo_concurrency.lockutils [req-f3324fd4-92ef-488f-99ae-6754d187ee8d 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:51:53.607 143779 DEBUG oslo_concurrency.lockutils [req-f3324fd4-92ef-488f-99ae-6754d187ee8d 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:51:53.607 143781 DEBUG oslo_concurrency.lockutils [req-f3324fd4-92ef-488f-99ae-6754d187ee8d 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 
01:51:53.607 143779 DEBUG oslo_concurrency.lockutils [req-f3324fd4-92ef-488f-99ae-6754d187ee8d 11fa09a546a147a7b6232a47c6d5db68 562ef31119d44e399fd3c90ec6589724 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:51:53.607 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:53.608 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:53.608 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:53.608 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:53.608 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:53.608 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:53.608 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:53.609 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:53.609 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:53.609 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:53.609 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:53.609 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:54.608 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:54.609 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:54.609 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:54.609 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:54.609 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:54.609 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:54.609 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:54.610 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:54.610 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:54.611 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:54.611 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:54.611 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:56.610 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:56.610 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:56.610 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:56.610 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:56.610 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:56.611 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:56.611 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:56.611 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:56.611 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:56.612 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:51:56.612 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:51:56.612 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:51:59.129 143787 DEBUG oslo_service.periodic_task [req-3bb273ae-decf-44a7-b905-83fdff74ee02 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:51:59.133 143787 DEBUG oslo_concurrency.lockutils [req-c2c5fdbc-2f49-4029-ab19-45714064176e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:51:59.134 143787 DEBUG oslo_concurrency.lockutils [req-c2c5fdbc-2f49-4029-ab19-45714064176e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:00.613 143780 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:00.613 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:00.613 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:00.613 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:00.613 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:00.614 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:00.614 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:00.614 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:00.614 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:00.617 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:00.617 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:00.618 143781 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:08.616 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:08.616 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:08.617 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:08.617 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:08.617 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:08.617 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:08.618 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:08.618 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:08.618 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:08.621 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:08.622 
143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:08.622 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:12.103 143780 DEBUG oslo_service.periodic_task [req-aea1951c-bcc7-485a-9a03-5bb79901bd8b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:52:12.106 143780 DEBUG oslo_concurrency.lockutils [req-9fee73d6-a683-42b6-9eed-722052586433 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:52:12.107 143780 DEBUG oslo_concurrency.lockutils [req-9fee73d6-a683-42b6-9eed-722052586433 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:15.766 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 9b4ec88ec9c74c6ea5801faa0b0e80dc __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:52:15.766 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 9b4ec88ec9c74c6ea5801faa0b0e80dc __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:52:15.766 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 9b4ec88ec9c74c6ea5801faa0b0e80dc __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:52:15.767 143781 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:15.767 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:15.767 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:15.767 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9b4ec88ec9c74c6ea5801faa0b0e80dc poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:52:15.767 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9b4ec88ec9c74c6ea5801faa0b0e80dc poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:52:15.767 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9b4ec88ec9c74c6ea5801faa0b0e80dc poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:52:15.767 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:15.767 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:15.767 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:15.767 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:15.767 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:15.767 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:15.768 143781 DEBUG oslo_concurrency.lockutils [req-0e0ea67e-addf-4c90-ac04-707464221bb2 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:52:15.768 143779 DEBUG oslo_concurrency.lockutils [req-0e0ea67e-addf-4c90-ac04-707464221bb2 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:52:15.768 143781 DEBUG nova.scheduler.host_manager [req-0e0ea67e-addf-4c90-ac04-707464221bb2 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:52:15.768 143779 DEBUG nova.scheduler.host_manager [req-0e0ea67e-addf-4c90-ac04-707464221bb2 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:52:15.768 143781 DEBUG oslo_concurrency.lockutils [req-0e0ea67e-addf-4c90-ac04-707464221bb2 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:15.768 143787 DEBUG oslo_concurrency.lockutils [req-0e0ea67e-addf-4c90-ac04-707464221bb2 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:52:15.768 143779 DEBUG oslo_concurrency.lockutils [req-0e0ea67e-addf-4c90-ac04-707464221bb2 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:15.768 143787 DEBUG nova.scheduler.host_manager [req-0e0ea67e-addf-4c90-ac04-707464221bb2 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:52:15.768 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:15.769 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:15.769 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:15.769 143787 DEBUG oslo_concurrency.lockutils [req-0e0ea67e-addf-4c90-ac04-707464221bb2 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:15.769 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 9b4ec88ec9c74c6ea5801faa0b0e80dc __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:52:15.769 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:15.769 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:15.769 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9b4ec88ec9c74c6ea5801faa0b0e80dc poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:52:15.769 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:15.769 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] 
AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:15.769 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:15.770 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:15.769 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:15.770 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:15.770 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:15.770 143780 DEBUG oslo_concurrency.lockutils [req-0e0ea67e-addf-4c90-ac04-707464221bb2 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:52:15.770 143780 DEBUG nova.scheduler.host_manager [req-0e0ea67e-addf-4c90-ac04-707464221bb2 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:52:15.770 143780 DEBUG oslo_concurrency.lockutils [req-0e0ea67e-addf-4c90-ac04-707464221bb2 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:15.772 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:15.772 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:15.772 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:16.057 143779 DEBUG oslo_service.periodic_task [req-263b7112-41a7-44dc-a57b-403bce4d2b23 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:52:16.061 143779 DEBUG oslo_concurrency.lockutils [req-a54e2127-96a8-45b0-a850-683581b3c53b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:52:16.062 143779 DEBUG oslo_concurrency.lockutils [req-a54e2127-96a8-45b0-a850-683581b3c53b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:16.769 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:16.770 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:16.770 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:16.770 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:16.771 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:16.771 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:16.771 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:16.771 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:16.772 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:16.773 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:16.773 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:16.774 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:18.772 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:18.772 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:18.772 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:18.773 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:18.774 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:18.774 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:18.774 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:18.774 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:18.774 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:18.776 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:18.776 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:18.776 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:22.041 143781 DEBUG oslo_service.periodic_task [req-f39617ca-d696-4c10-acd8-c39a49e9140a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:52:22.046 143781 DEBUG oslo_concurrency.lockutils [req-c38888da-ad96-49d6-95a3-9497179c70f1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:52:22.046 143781 DEBUG oslo_concurrency.lockutils [req-c38888da-ad96-49d6-95a3-9497179c70f1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:22.776 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:22.776 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:22.776 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:22.777 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:22.777 143779 DEBUG oslo_messaging._drivers.amqpdriver 
[-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:22.777 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:22.781 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:22.781 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:22.781 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:22.781 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:22.782 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:22.782 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:29.139 143787 DEBUG oslo_service.periodic_task [req-c2c5fdbc-2f49-4029-ab19-45714064176e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:52:29.145 143787 DEBUG oslo_concurrency.lockutils [req-6c6a0730-2e01-4b5d-8435-3169d15a3af8 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 
01:52:29.145 143787 DEBUG oslo_concurrency.lockutils [req-6c6a0730-2e01-4b5d-8435-3169d15a3af8 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:30.781 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:30.781 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:30.781 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:30.782 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:30.783 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:30.783 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:30.783 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:30.784 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:30.784 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:30.787 143780 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:30.788 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:30.788 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:31.913 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 786c8a61cc37452d879a5033ae47d413 reply to reply_c93f245d97184cab8c1af1830c6ac100 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-02 01:52:31.913 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:31.913 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 144d42a1255b45f284327d554cb429f5 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:52:31.914 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:31.914 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:31.916 143787 DEBUG nova.scheduler.manager [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['b42fc1c5-d0b0-4212-8037-086cff8a5aca'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-02 01:52:31.921 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:31.927 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:31.927 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:31.940 143787 DEBUG nova.scheduler.request_filter [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:52:31.941 143787 DEBUG nova.scheduler.request_filter [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-02 01:52:31.941 143787 DEBUG nova.scheduler.request_filter [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:52:31.942 143787 DEBUG nova.scheduler.request_filter [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper 
/usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:52:32.376 143787 DEBUG nova.scheduler.request_filter [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.4 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:52:32.382 143787 DEBUG oslo_concurrency.lockutils [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:52:32.382 143787 DEBUG oslo_concurrency.lockutils [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:32.811 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: f412f54172514d60ba768e37c3f811b8 reply to reply_02dbf4ca539d4c419255761ef99731cf __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-02 01:52:32.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:32.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: e9415e344f3d484c8ca94902030db4f8 poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:52:32.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:32.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:32.814 143781 DEBUG nova.scheduler.manager [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['73310ea8-fda6-4f15-91bf-bc59cb14b9e6'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-02 01:52:32.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:32.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:32.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:32.819 143781 DEBUG nova.scheduler.request_filter [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:52:32.819 143781 DEBUG nova.scheduler.request_filter [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 
6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-02 01:52:32.819 143781 DEBUG nova.scheduler.request_filter [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:52:32.820 143781 DEBUG nova.scheduler.request_filter [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:52:32.928 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:32.929 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:32.929 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:33.165 143787 DEBUG oslo_messaging._drivers.amqpdriver [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 26c7189116d54782b38eb24bc0f19911 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:52:33.170 143787 DEBUG oslo_concurrency.lockutils 
[req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:52:33.170 143787 DEBUG oslo_concurrency.lockutils [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:33.183 143787 DEBUG oslo_concurrency.lockutils [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:52:33.183 143787 DEBUG nova.scheduler.host_manager [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", 
"avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", "tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", "mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, 
"nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", "memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:52:28Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-02 01:52:33.185 143787 DEBUG nova.scheduler.host_manager [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-02 01:52:33.185 143787 DEBUG nova.scheduler.host_manager [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 
'nova-compute', 'topic': 'compute', 'report_count': 440, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 2, 1, 52, 31, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 52, 31, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-02 01:52:33.185 143787 DEBUG nova.scheduler.host_manager [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-02 01:52:33.186 143787 DEBUG oslo_concurrency.lockutils [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.003s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:33.186 143787 INFO nova.scheduler.host_manager [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1 2026-04-02 01:52:33.186 143787 DEBUG nova.scheduler.manager [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, 
cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-02 01:52:33.186 143787 DEBUG nova.scheduler.manager [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-02 01:52:33.187 143787 DEBUG nova.scheduler.utils [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance b42fc1c5-d0b0-4212-8037-086cff8a5aca claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-02 01:52:33.295 143787 DEBUG nova.scheduler.manager [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: b42fc1c5-d0b0-4212-8037-086cff8a5aca] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-02 01:52:33.296 143787 DEBUG oslo_concurrency.lockutils [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by 
"nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:52:33.296 143787 DEBUG oslo_concurrency.lockutils [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:33.298 143787 DEBUG oslo_messaging._drivers.amqpdriver [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: a1ed0d62081a4e00b391c79c984f6683 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:52:33.304 143787 DEBUG oslo_messaging._drivers.amqpdriver [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: 786c8a61cc37452d879a5033ae47d413 reply queue: reply_c93f245d97184cab8c1af1830c6ac100 time elapsed: 1.3902848600000652s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-02 01:52:33.497 143781 DEBUG nova.scheduler.request_filter [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.7 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:52:33.504 143781 DEBUG 
oslo_concurrency.lockutils [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:52:33.505 143781 DEBUG oslo_concurrency.lockutils [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:33.567 143781 DEBUG oslo_messaging._drivers.amqpdriver [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: ec34524b64a04184980a617b7483e494 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:52:33.569 143781 DEBUG oslo_concurrency.lockutils [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:52:33.569 143781 DEBUG oslo_concurrency.lockutils [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 
6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:33.585 143781 DEBUG oslo_concurrency.lockutils [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:52:33.585 143781 DEBUG nova.scheduler.host_manager [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", "tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", 
"sse2", "avx512vbmi2", "pse", "mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", 
"memory_usage", "network_metadata", "memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:52:28Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-02 01:52:33.586 143781 DEBUG nova.scheduler.host_manager [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-02 01:52:33.588 143781 DEBUG nova.scheduler.host_manager [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 440, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 2, 1, 52, 31, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 52, 31, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-02 01:52:33.588 143781 DEBUG nova.scheduler.host_manager 
[req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-02 01:52:33.588 143781 DEBUG oslo_concurrency.lockutils [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.003s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:33.589 143781 INFO nova.scheduler.host_manager [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1 2026-04-02 01:52:33.589 143781 DEBUG nova.scheduler.manager [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-02 01:52:33.589 143781 DEBUG nova.scheduler.manager [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0, weight: 
0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-02 01:52:33.589 143781 DEBUG nova.scheduler.utils [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance 73310ea8-fda6-4f15-91bf-bc59cb14b9e6 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-02 01:52:33.662 143781 DEBUG nova.scheduler.manager [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: 73310ea8-fda6-4f15-91bf-bc59cb14b9e6] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-02 01:52:33.662 143781 DEBUG oslo_concurrency.lockutils [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:52:33.664 143781 DEBUG oslo_concurrency.lockutils [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.001s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:33.665 143781 DEBUG oslo_messaging._drivers.amqpdriver [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 3256ba90869844acafe3db8581d5aa51 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:52:33.667 143781 DEBUG oslo_messaging._drivers.amqpdriver [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: f412f54172514d60ba768e37c3f811b8 reply queue: reply_02dbf4ca539d4c419255761ef99731cf time elapsed: 0.8549349160002748s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-02 01:52:33.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:33.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:33.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:34.930 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:34.930 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:34.930 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:35.818 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:35.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:35.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:37.250 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c64eefc205244dbd82a7bcb9958a01c6 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:52:37.250 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c64eefc205244dbd82a7bcb9958a01c6 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:52:37.250 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c64eefc205244dbd82a7bcb9958a01c6 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:52:37.251 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:37.251 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:37.251 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c64eefc205244dbd82a7bcb9958a01c6 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:52:37.251 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:37.251 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c64eefc205244dbd82a7bcb9958a01c6 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:52:37.251 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c64eefc205244dbd82a7bcb9958a01c6 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:52:37.251 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c64eefc205244dbd82a7bcb9958a01c6 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:52:37.251 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:37.251 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:37.251 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:37.251 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:37.251 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:37.251 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:37.251 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c64eefc205244dbd82a7bcb9958a01c6 poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:52:37.251 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:37.252 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:37.252 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:37.253 143780 DEBUG oslo_concurrency.lockutils [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:52:37.253 143781 DEBUG oslo_concurrency.lockutils [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:52:37.253 143780 DEBUG oslo_concurrency.lockutils [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:37.253 143787 DEBUG oslo_concurrency.lockutils [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 
1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:52:37.253 143781 DEBUG oslo_concurrency.lockutils [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:37.254 143787 DEBUG oslo_concurrency.lockutils [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:37.254 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 61492930849048459b3696857ce44778 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:52:37.254 143779 DEBUG oslo_concurrency.lockutils [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:52:37.255 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 
2026-04-02 01:52:37.255 143779 DEBUG oslo_concurrency.lockutils [req-3829c1da-300e-4724-a9e7-fecfdfbaa4db 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:37.256 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 61492930849048459b3696857ce44778 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:52:37.256 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:37.256 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:37.256 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 61492930849048459b3696857ce44778 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:52:37.256 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:37.256 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 61492930849048459b3696857ce44778 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:52:37.256 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 61492930849048459b3696857ce44778 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:52:37.256 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:37.257 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:37.257 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:37.257 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 61492930849048459b3696857ce44778 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:52:37.257 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:37.257 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:37.258 143780 DEBUG oslo_concurrency.lockutils [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:52:37.258 143780 DEBUG oslo_concurrency.lockutils [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:37.259 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:37.259 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:37.259 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:37.260 143787 DEBUG oslo_concurrency.lockutils [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:52:37.260 143787 DEBUG oslo_concurrency.lockutils [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:37.262 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:37.262 143779 DEBUG oslo_concurrency.lockutils [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:52:37.262 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:37.262 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:37.262 143779 DEBUG oslo_concurrency.lockutils [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:37.255 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 61492930849048459b3696857ce44778 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:52:37.263 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:37.264 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:37.265 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 61492930849048459b3696857ce44778 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:52:37.265 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:37.265 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:37.263 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:37.266 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:37.268 143781 DEBUG oslo_concurrency.lockutils [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:52:37.268 143781 DEBUG oslo_concurrency.lockutils [req-108a7577-0138-47e4-a5df-2a64e0f64624 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:52:37.268 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:37.269 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:37.269 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:38.261 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:38.261 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:38.261 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:38.263 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:38.264 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:38.264 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:38.268 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:38.268 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:38.268 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:38.270 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:38.270 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:38.271 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:40.263 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:40.264 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:40.264 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:40.266 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:40.266 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:40.266 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:40.270 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:40.270 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:40.271 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:52:40.272 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:52:40.272 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:52:40.272 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:42.112 143780 DEBUG oslo_service.periodic_task [req-9fee73d6-a683-42b6-9eed-722052586433 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:52:42.116 143780 DEBUG oslo_concurrency.lockutils [req-d0d8911f-598a-4d12-8f4c-01721292d464 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:52:42.117 143780 DEBUG oslo_concurrency.lockutils [req-d0d8911f-598a-4d12-8f4c-01721292d464 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:52:44.267 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:52:44.267 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:44.267 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:44.271 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:52:44.271 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:44.271 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:44.274 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:52:44.274 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:44.274 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:44.276 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:52:44.276 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:44.276 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:46.067 143779 DEBUG oslo_service.periodic_task [req-a54e2127-96a8-45b0-a850-683581b3c53b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:52:46.070 143779 DEBUG oslo_concurrency.lockutils [req-653613ba-acca-4202-9c37-546664dba9b5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:52:46.071 143779 DEBUG oslo_concurrency.lockutils [req-653613ba-acca-4202-9c37-546664dba9b5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:52:52.057 143781 DEBUG oslo_service.periodic_task [req-c38888da-ad96-49d6-95a3-9497179c70f1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:52:52.061 143781 DEBUG oslo_concurrency.lockutils [req-e4e8faa0-2840-473d-9752-b81cad27a9f9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:52:52.061 143781 DEBUG oslo_concurrency.lockutils [req-e4e8faa0-2840-473d-9752-b81cad27a9f9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:52:52.274 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:52:52.274 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:52.274 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:52.275 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:52:52.276 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:52.276 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:52.278 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:52:52.278 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:52.278 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:52.279 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:52:52.280 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:52.280 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:54.792 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: b2b82d7ad9fe471c9bd49d7deedc8dec __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:52:54.792 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: b2b82d7ad9fe471c9bd49d7deedc8dec __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:52:54.792 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: b2b82d7ad9fe471c9bd49d7deedc8dec __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:52:54.792 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:54.792 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:54.792 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:54.793 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: b2b82d7ad9fe471c9bd49d7deedc8dec poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:52:54.793 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: b2b82d7ad9fe471c9bd49d7deedc8dec poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:52:54.793 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: b2b82d7ad9fe471c9bd49d7deedc8dec poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:52:54.793 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:54.793 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:54.793 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:54.793 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:54.793 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:54.793 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:54.794 143779 DEBUG oslo_concurrency.lockutils [req-b4f20699-15f8-4f87-b3a0-fc643e5b18dc 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:52:54.794 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: b2b82d7ad9fe471c9bd49d7deedc8dec __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:52:54.794 143780 DEBUG oslo_concurrency.lockutils [req-b4f20699-15f8-4f87-b3a0-fc643e5b18dc 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:52:54.794 143781 DEBUG oslo_concurrency.lockutils [req-b4f20699-15f8-4f87-b3a0-fc643e5b18dc 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:52:54.795 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:54.795 143781 DEBUG oslo_concurrency.lockutils [req-b4f20699-15f8-4f87-b3a0-fc643e5b18dc 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:52:54.795 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: b2b82d7ad9fe471c9bd49d7deedc8dec poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:52:54.795 143779 DEBUG oslo_concurrency.lockutils [req-b4f20699-15f8-4f87-b3a0-fc643e5b18dc 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:52:54.795 143780 DEBUG oslo_concurrency.lockutils [req-b4f20699-15f8-4f87-b3a0-fc643e5b18dc 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:52:54.795 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:54.795 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:54.795 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:52:54.795 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:52:54.795 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:52:54.795 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:54.795 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:54.795 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:54.795 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:54.795 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:54.795 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:54.796 143787 DEBUG oslo_concurrency.lockutils [req-b4f20699-15f8-4f87-b3a0-fc643e5b18dc 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:52:54.796 143787 DEBUG oslo_concurrency.lockutils [req-b4f20699-15f8-4f87-b3a0-fc643e5b18dc 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:52:54.797 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:52:54.797 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:54.798 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:55.796 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:52:55.796 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:55.796 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:52:55.797 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:52:55.797 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:55.797 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:55.797 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:55.797 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:55.797 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:55.799 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:52:55.799 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:55.800 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:57.799 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:52:57.799 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:57.799 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:57.800 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:52:57.800 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:52:57.800 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:57.800 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:57.800 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:57.800 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:57.802 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:52:57.802 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:57.802 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:59.926 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 6d469cf7619341da82093d26f748f7de __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:52:59.926 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 6d469cf7619341da82093d26f748f7de __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:52:59.926 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 6d469cf7619341da82093d26f748f7de __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:52:59.926 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:59.926 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:59.926 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:59.926 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 6d469cf7619341da82093d26f748f7de poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:52:59.926 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 6d469cf7619341da82093d26f748f7de poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:52:59.926 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 6d469cf7619341da82093d26f748f7de poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:52:59.926 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:59.926 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:59.926 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:59.926 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:59.926 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:59.926 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:59.927 143787 DEBUG oslo_concurrency.lockutils [req-db3e2cad-4c2d-4e3f-9c20-a63dbb42718c 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:52:59.927 143779 DEBUG oslo_concurrency.lockutils [req-db3e2cad-4c2d-4e3f-9c20-a63dbb42718c 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:52:59.927 143780 DEBUG oslo_concurrency.lockutils [req-db3e2cad-4c2d-4e3f-9c20-a63dbb42718c 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:52:59.927 143787 DEBUG oslo_concurrency.lockutils [req-db3e2cad-4c2d-4e3f-9c20-a63dbb42718c 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:52:59.927 143779 DEBUG oslo_concurrency.lockutils [req-db3e2cad-4c2d-4e3f-9c20-a63dbb42718c 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:52:59.927 143780 DEBUG oslo_concurrency.lockutils [req-db3e2cad-4c2d-4e3f-9c20-a63dbb42718c 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:52:59.928 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 6d469cf7619341da82093d26f748f7de __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:52:59.928 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:59.928 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 6d469cf7619341da82093d26f748f7de poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:52:59.929 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:52:59.929 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:52:59.929 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:52:59.929 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:59.929 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:59.929 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:59.929 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:59.929 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:59.929 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:59.930 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:59.930 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:52:59.931 143781 DEBUG oslo_concurrency.lockutils [req-db3e2cad-4c2d-4e3f-9c20-a63dbb42718c 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:52:59.931 143781 DEBUG oslo_concurrency.lockutils [req-db3e2cad-4c2d-4e3f-9c20-a63dbb42718c 1fd5007b495c4bc8a1246d7e453892a2 2341ca2bd3ba4145aaee7cd8964b555b - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:52:59.932 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:52:59.932 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:52:59.932 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:00.082 143787 DEBUG oslo_service.periodic_task [req-6c6a0730-2e01-4b5d-8435-3169d15a3af8 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:53:00.087 143787 DEBUG oslo_concurrency.lockutils [req-6a596f26-2594-44d5-af8e-b640d3920b7b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:00.087 143787 DEBUG oslo_concurrency.lockutils [req-6a596f26-2594-44d5-af8e-b640d3920b7b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:00.930 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:00.930 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:00.930 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:00.930 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:00.931 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:00.931 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:00.931 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:00.931 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:00.931 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:00.933 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:00.934 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:00.934 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:02.259 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 5ef4f9a384ed4987b2fc80b5b1cd4798 reply to reply_54bc7bb5014144dab053290b62b8003d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320
2026-04-02 01:53:02.260 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:02.260 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1f1bda5dcc2142f4a50a5be2518bf0fe poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:53:02.260 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:02.260 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:02.262 143780 DEBUG nova.scheduler.manager [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['64644991-f27b-4dfc-855b-ce5108924b91'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141
2026-04-02 01:53:02.264 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:02.264 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:02.264 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:02.267 143780 DEBUG nova.scheduler.request_filter [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:53:02.267 143780 DEBUG nova.scheduler.request_filter [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254
2026-04-02 01:53:02.268 143780 DEBUG nova.scheduler.request_filter [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:53:02.268 143780 DEBUG nova.scheduler.request_filter [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:53:02.268 143780 DEBUG nova.scheduler.request_filter [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:53:02.273 143780 DEBUG oslo_concurrency.lockutils [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:02.273 143780 DEBUG oslo_concurrency.lockutils [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:02.330 143780 DEBUG oslo_messaging._drivers.amqpdriver [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 169aaad322334d40b6053c4a03aa5300 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656
2026-04-02 01:53:02.332 143780 DEBUG oslo_concurrency.lockutils [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:02.332 143780 DEBUG oslo_concurrency.lockutils [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:02.341 143780 DEBUG oslo_concurrency.lockutils [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:02.342 143780 DEBUG nova.scheduler.host_manager [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", "tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2",
"msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", "mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, 
"nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", "memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0',io_workload='0',num_instances='0',num_os_type_None='0',num_proj_2341ca2bd3ba4145aaee7cd8964b555b='0',num_task_None='0',num_vm_building='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:53:00Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-02 01:53:02.344 143780 DEBUG nova.scheduler.host_manager [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-02 01:53:02.344 143780 DEBUG nova.scheduler.host_manager [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 443, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 2, 1, 53, 1, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 53, 
1, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-02 01:53:02.344 143780 DEBUG nova.scheduler.host_manager [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-02 01:53:02.344 143780 DEBUG oslo_concurrency.lockutils [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.003s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:53:02.345 143780 INFO nova.scheduler.host_manager [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1 2026-04-02 01:53:02.345 143780 DEBUG nova.scheduler.manager [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-02 01:53:02.346 143780 DEBUG nova.scheduler.manager [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 
6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-02 01:53:02.346 143780 DEBUG nova.scheduler.utils [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance 64644991-f27b-4dfc-855b-ce5108924b91 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-02 01:53:02.442 143780 DEBUG nova.scheduler.manager [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: 64644991-f27b-4dfc-855b-ce5108924b91] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-02 01:53:02.443 143780 DEBUG oslo_concurrency.lockutils [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:53:02.443 143780 DEBUG oslo_concurrency.lockutils [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 
6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:53:02.445 143780 DEBUG oslo_messaging._drivers.amqpdriver [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: b056f5925d9943b2a7eb215720d4d79b NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:53:02.447 143780 DEBUG oslo_messaging._drivers.amqpdriver [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: 5ef4f9a384ed4987b2fc80b5b1cd4798 reply queue: reply_54bc7bb5014144dab053290b62b8003d time elapsed: 0.18675745900054608s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-02 01:53:02.932 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:02.932 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:02.932 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:02.933 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:02.935 143781 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:02.936 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:02.936 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:02.936 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:02.936 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:03.265 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:03.265 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:03.266 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:03.385 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: ba9ab4f2b70140f49f57a8813e5204a6 reply to reply_54bc7bb5014144dab053290b62b8003d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-02 01:53:03.385 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:03.386 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 
406558a7bccb4bef93e0d8f346f757fd poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:53:03.386 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:03.386 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:03.387 143779 DEBUG nova.scheduler.manager [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['8a68c13b-ba77-4e64-aa63-bd038944f21f'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-02 01:53:03.389 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:03.389 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:03.390 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:03.392 143779 DEBUG nova.scheduler.request_filter [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:53:03.393 143779 DEBUG nova.scheduler.request_filter [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 
6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-02 01:53:03.393 143779 DEBUG nova.scheduler.request_filter [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:53:03.393 143779 DEBUG nova.scheduler.request_filter [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:53:03.393 143779 DEBUG nova.scheduler.request_filter [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:53:03.398 143779 DEBUG oslo_concurrency.lockutils [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:53:03.398 143779 DEBUG oslo_concurrency.lockutils [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 
fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:53:03.452 143779 DEBUG oslo_messaging._drivers.amqpdriver [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 3aa19254869b42d38bd108c621d2831f NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:53:03.454 143779 DEBUG oslo_concurrency.lockutils [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:53:03.454 143779 DEBUG oslo_concurrency.lockutils [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:53:03.462 143779 DEBUG oslo_concurrency.lockutils [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 
'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:53:03.462 143779 DEBUG nova.scheduler.host_manager [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", "tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", "mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=76,free_ram_mb=30363,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=1,mapped=1,memory_mb=31899,memory_mb_used=1536,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", 
"nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", "memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": 
["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=1,service_id=None,stats={failed_builds='0',io_workload='1',num_instances='1',num_os_type_None='1',num_proj_2341ca2bd3ba4145aaee7cd8964b555b='0',num_proj_fcd11eb48a99473c970767c28064e907='1',num_task_None='1',num_vm_building='1'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:53:03Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=1) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-02 01:53:03.463 143779 DEBUG nova.scheduler.host_manager [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-02 01:53:03.463 143779 DEBUG nova.scheduler.host_manager [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 443, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 2, 1, 53, 1, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 53, 1, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-02 01:53:03.463 143779 DEBUG 
nova.scheduler.host_manager [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-02 01:53:03.464 143779 DEBUG oslo_concurrency.lockutils [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:53:03.464 143779 INFO nova.scheduler.host_manager [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1 2026-04-02 01:53:03.464 143779 DEBUG nova.scheduler.manager [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 26624MB io_ops: 1 instances: 1]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-02 01:53:03.464 143779 DEBUG nova.scheduler.manager [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 26624MB 
io_ops: 1 instances: 1, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632
2026-04-02 01:53:03.465 143779 DEBUG nova.scheduler.utils [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance 8a68c13b-ba77-4e64-aa63-bd038944f21f claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253
2026-04-02 01:53:03.528 143779 DEBUG nova.scheduler.manager [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: 8a68c13b-ba77-4e64-aa63-bd038944f21f] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 26624MB io_ops: 1 instances: 1 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513
2026-04-02 01:53:03.528 143779 DEBUG oslo_concurrency.lockutils [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:03.529 143779 DEBUG oslo_concurrency.lockutils [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:03.530 143779 DEBUG oslo_messaging._drivers.amqpdriver [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 5aae0833b8cd49d2980e9a3bd71c0c0f NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656
2026-04-02 01:53:03.533 143779 DEBUG oslo_messaging._drivers.amqpdriver [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: ba9ab4f2b70140f49f57a8813e5204a6 reply queue: reply_54bc7bb5014144dab053290b62b8003d time elapsed: 0.14737270000023273s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118
2026-04-02 01:53:04.392 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:04.392 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:04.392 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:05.268 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:05.268 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:05.269 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:05.659 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1d859a8ab7914c81a05726d9b6bb408f __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:53:05.659 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1d859a8ab7914c81a05726d9b6bb408f __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:53:05.660 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1d859a8ab7914c81a05726d9b6bb408f __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:53:05.660 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1d859a8ab7914c81a05726d9b6bb408f __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:53:05.660 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:05.660 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1d859a8ab7914c81a05726d9b6bb408f poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:53:05.660 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:05.660 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:05.660 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1d859a8ab7914c81a05726d9b6bb408f poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:53:05.660 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:05.660 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1d859a8ab7914c81a05726d9b6bb408f poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:53:05.660 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:05.660 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:05.660 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1d859a8ab7914c81a05726d9b6bb408f poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:53:05.660 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:05.660 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:05.661 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:05.661 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:05.661 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:05.661 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:05.662 143779 DEBUG oslo_concurrency.lockutils [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:05.662 143779 DEBUG oslo_concurrency.lockutils [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:05.663 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:05.663 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:05.663 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:05.663 143780 DEBUG oslo_concurrency.lockutils [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:05.663 143780 DEBUG oslo_concurrency.lockutils [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:05.663 143781 DEBUG oslo_concurrency.lockutils [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:05.664 143781 DEBUG oslo_concurrency.lockutils [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:05.664 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:05.664 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:05.664 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:05.664 143787 DEBUG oslo_concurrency.lockutils [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:05.664 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:05.665 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:05.665 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:05.665 143787 DEBUG oslo_concurrency.lockutils [req-1e3500f9-f0ee-456c-aea1-ed55d31991be 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:05.666 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:05.667 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:05.667 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:06.658 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f401143a4dbf46eb94916abb692dd977 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:53:06.658 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f401143a4dbf46eb94916abb692dd977 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:53:06.658 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:06.659 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f401143a4dbf46eb94916abb692dd977 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:53:06.658 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f401143a4dbf46eb94916abb692dd977 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:53:06.659 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:06.659 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:06.659 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f401143a4dbf46eb94916abb692dd977 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:53:06.659 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:06.659 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:06.659 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:06.659 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:06.659 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f401143a4dbf46eb94916abb692dd977 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:53:06.659 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:06.660 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:06.661 143787 DEBUG oslo_concurrency.lockutils [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:06.661 143781 DEBUG oslo_concurrency.lockutils [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:06.661 143787 DEBUG oslo_concurrency.lockutils [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:06.661 143781 DEBUG oslo_concurrency.lockutils [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:06.661 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:06.662 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:06.662 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:06.662 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:06.662 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:06.662 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f401143a4dbf46eb94916abb692dd977 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:53:06.662 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:06.662 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:06.662 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f401143a4dbf46eb94916abb692dd977 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:53:06.662 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:06.663 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:06.665 143780 DEBUG oslo_concurrency.lockutils [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:06.665 143780 DEBUG oslo_concurrency.lockutils [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:06.665 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:06.666 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:06.666 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:06.667 143779 DEBUG oslo_concurrency.lockutils [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:06.667 143779 DEBUG oslo_concurrency.lockutils [req-0b94e385-c281-4f33-bc95-46fe19672d40 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:06.668 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:06.668 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:06.668 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:07.662 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:07.663 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:07.663 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:07.664 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:07.664 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:07.664 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:07.667 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:07.668 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:07.668 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:07.669 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:07.670 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:07.670 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:09.666 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:09.666 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:09.666 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:09.666 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:09.666 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:09.666 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:09.669 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:09.670 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:09.670 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:09.672 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:09.673 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:09.673 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:13.045 143780 DEBUG oslo_service.periodic_task [req-d0d8911f-598a-4d12-8f4c-01721292d464 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:53:13.048 143780 DEBUG oslo_concurrency.lockutils [req-22b01fe8-d6ea-4624-88ca-da8f9f5b0425 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:13.049 143780 DEBUG oslo_concurrency.lockutils [req-22b01fe8-d6ea-4624-88ca-da8f9f5b0425 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:13.669 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:13.670 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:13.670 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:13.670 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:13.670 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:13.671 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:13.671 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:13.671 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:13.671 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:13.677 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:13.677 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:13.677 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:17.057 143779 DEBUG oslo_service.periodic_task [req-653613ba-acca-4202-9c37-546664dba9b5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:53:17.061 143779 DEBUG oslo_concurrency.lockutils [req-21394dba-7a24-41b2-8737-b7ab2c2ff550 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:17.061 143779 DEBUG oslo_concurrency.lockutils [req-21394dba-7a24-41b2-8737-b7ab2c2ff550 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:21.674 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:21.674 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:21.674 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:21.674 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:21.674 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:21.674 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:21.678 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:21.678 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:21.678 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:21.681 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:21.682 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:21.682 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:22.066 143781 DEBUG oslo_service.periodic_task [req-e4e8faa0-2840-473d-9752-b81cad27a9f9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:53:22.070 143781 DEBUG oslo_concurrency.lockutils [req-24a62af8-3e7f-422c-939f-0e2974b0dd8a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:22.071 143781 DEBUG oslo_concurrency.lockutils [req-24a62af8-3e7f-422c-939f-0e2974b0dd8a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:30.093 143787 DEBUG oslo_service.periodic_task [req-6a596f26-2594-44d5-af8e-b640d3920b7b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:53:30.098 143787 DEBUG oslo_concurrency.lockutils [req-fd79eab8-6a37-4c31-a4f3-54ffa8c7f9ca - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:30.098 143787 DEBUG oslo_concurrency.lockutils [req-fd79eab8-6a37-4c31-a4f3-54ffa8c7f9ca - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:34.350 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f356d5e98ff94d8c820c5ba0bbab6588 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:53:34.350 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f356d5e98ff94d8c820c5ba0bbab6588 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:53:34.350 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:34.350 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:34.350 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f356d5e98ff94d8c820c5ba0bbab6588 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:53:34.350 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f356d5e98ff94d8c820c5ba0bbab6588 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:53:34.350 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f356d5e98ff94d8c820c5ba0bbab6588 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:53:34.351 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:34.351 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:34.351 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:34.351 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:34.351 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:34.351 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f356d5e98ff94d8c820c5ba0bbab6588 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:53:34.351 143780 DEBUG oslo_concurrency.lockutils [req-5a9a38f8-dda3-4862-9917-1fdd797dafb5 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:34.351 143787 DEBUG oslo_concurrency.lockutils [req-5a9a38f8-dda3-4862-9917-1fdd797dafb5 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:34.352 143780 DEBUG oslo_concurrency.lockutils [req-5a9a38f8-dda3-4862-9917-1fdd797dafb5 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:34.352 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:34.352 143787 DEBUG oslo_concurrency.lockutils [req-5a9a38f8-dda3-4862-9917-1fdd797dafb5 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:34.352 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:34.352 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f356d5e98ff94d8c820c5ba0bbab6588 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:53:34.353 143779 DEBUG oslo_concurrency.lockutils [req-5a9a38f8-dda3-4862-9917-1fdd797dafb5 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:34.353 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:34.353 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:34.353 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:34.353 143779 DEBUG oslo_concurrency.lockutils [req-5a9a38f8-dda3-4862-9917-1fdd797dafb5 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock
"host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:53:34.353 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:34.353 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:34.354 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:34.354 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f356d5e98ff94d8c820c5ba0bbab6588 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:53:34.354 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:34.354 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:34.354 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:34.354 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:34.354 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:34.354 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:34.355 143781 DEBUG oslo_concurrency.lockutils [req-5a9a38f8-dda3-4862-9917-1fdd797dafb5 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:53:34.355 143781 DEBUG oslo_concurrency.lockutils [req-5a9a38f8-dda3-4862-9917-1fdd797dafb5 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:53:34.356 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:34.356 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:34.356 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:34.397 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8888899efe0d4bb78d01670777ca3bdd __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:53:34.397 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8888899efe0d4bb78d01670777ca3bdd __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:53:34.397 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] 
received message with unique_id: 8888899efe0d4bb78d01670777ca3bdd __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:53:34.397 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8888899efe0d4bb78d01670777ca3bdd __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:53:34.398 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:34.398 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8888899efe0d4bb78d01670777ca3bdd poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:53:34.399 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:34.399 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:34.399 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:34.399 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:34.399 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8888899efe0d4bb78d01670777ca3bdd poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:53:34.399 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:34.399 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with 
unique_id: 8888899efe0d4bb78d01670777ca3bdd poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:53:34.399 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:34.399 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8888899efe0d4bb78d01670777ca3bdd poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:53:34.399 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:34.399 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:34.399 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:34.399 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:34.399 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:34.399 143780 DEBUG oslo_concurrency.lockutils [req-bf181ca9-343e-445f-a247-786991fc1ded 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:53:34.400 143781 DEBUG oslo_concurrency.lockutils [req-bf181ca9-343e-445f-a247-786991fc1ded 45269c782736425784f40b17da1230e5 
fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:53:34.400 143779 DEBUG oslo_concurrency.lockutils [req-bf181ca9-343e-445f-a247-786991fc1ded 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:53:34.400 143780 DEBUG oslo_concurrency.lockutils [req-bf181ca9-343e-445f-a247-786991fc1ded 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:53:34.400 143781 DEBUG oslo_concurrency.lockutils [req-bf181ca9-343e-445f-a247-786991fc1ded 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:53:34.400 143779 DEBUG oslo_concurrency.lockutils [req-bf181ca9-343e-445f-a247-786991fc1ded 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 
01:53:34.401 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:34.401 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:34.401 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:34.401 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:34.401 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:34.402 143787 DEBUG oslo_concurrency.lockutils [req-bf181ca9-343e-445f-a247-786991fc1ded 45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:53:34.402 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:34.402 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:34.402 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:34.402 143787 DEBUG oslo_concurrency.lockutils [req-bf181ca9-343e-445f-a247-786991fc1ded 
45269c782736425784f40b17da1230e5 fcd11eb48a99473c970767c28064e907 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:53:34.402 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:34.402 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:34.403 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:34.403 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:35.402 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:35.402 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:35.402 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:35.402 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:35.403 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:35.403 143781 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:35.403 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:35.403 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:35.403 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:35.403 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:35.404 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:35.404 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:37.404 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:37.404 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:37.404 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:37.404 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:37.405 
143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:37.405 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:37.405 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:37.405 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:37.405 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:37.405 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:37.405 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:37.406 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:41.409 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:41.409 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:41.409 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:41.409 
143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:41.410 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:41.410 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:41.410 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:41.410 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:41.410 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:41.410 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:41.411 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:41.411 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:43.967 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 339d14c1f0e6474b98e5b1726f7ea226 reply to reply_02dbf4ca539d4c419255761ef99731cf __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-02 01:53:43.967 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:43.967 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4e02a4cc1ca342baba6b44d12b89d821 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:53:43.968 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:43.968 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:43.970 143787 DEBUG nova.scheduler.manager [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['8827c6d6-073e-4243-8041-1f3b49ff461b'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-02 01:53:43.971 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:43.971 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:43.972 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:43.975 143787 DEBUG nova.scheduler.request_filter [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 
01:53:43.975 143787 DEBUG nova.scheduler.request_filter [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-02 01:53:43.975 143787 DEBUG nova.scheduler.request_filter [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:53:43.976 143787 DEBUG nova.scheduler.request_filter [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:53:43.976 143787 DEBUG nova.scheduler.request_filter [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:53:43.980 143787 DEBUG oslo_concurrency.lockutils [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 
2026-04-02 01:53:43.981 143787 DEBUG oslo_concurrency.lockutils [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:53:44.035 143787 DEBUG oslo_messaging._drivers.amqpdriver [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 8aa5aa7b57314a8c86794c15f05fefba NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:53:44.036 143787 DEBUG oslo_concurrency.lockutils [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:53:44.037 143787 DEBUG oslo_concurrency.lockutils [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:53:44.044 143780 DEBUG oslo_service.periodic_task [req-22b01fe8-d6ea-4624-88ca-da8f9f5b0425 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells 
run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:53:44.045 143787 DEBUG oslo_concurrency.lockutils [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:53:44.045 143787 DEBUG nova.scheduler.host_manager [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", "tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", 
"mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=24,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", 
"memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0',io_workload='0',num_instances='0',num_os_type_None='0',num_proj_fcd11eb48a99473c970767c28064e907='0',num_task_None='0',num_vm_active='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:53:34Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-02 01:53:44.046 143787 DEBUG nova.scheduler.host_manager [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-02 01:53:44.046 143787 DEBUG nova.scheduler.host_manager [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 447, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 2, 1, 53, 41, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 53, 41, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update 
/usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-02 01:53:44.046 143787 DEBUG nova.scheduler.host_manager [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-02 01:53:44.046 143787 DEBUG oslo_concurrency.lockutils [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:53:44.047 143787 INFO nova.scheduler.host_manager [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1 2026-04-02 01:53:44.047 143787 DEBUG nova.scheduler.manager [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-02 01:53:44.047 143787 DEBUG nova.scheduler.manager [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: 
(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-02 01:53:44.047 143787 DEBUG nova.scheduler.utils [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance 8827c6d6-073e-4243-8041-1f3b49ff461b claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-02 01:53:44.047 143780 DEBUG oslo_concurrency.lockutils [req-08457efe-1409-46ad-9f07-e76542ff69fa - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:53:44.048 143780 DEBUG oslo_concurrency.lockutils [req-08457efe-1409-46ad-9f07-e76542ff69fa - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:53:44.130 143787 DEBUG nova.scheduler.manager [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: 8827c6d6-073e-4243-8041-1f3b49ff461b] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-02 01:53:44.130 143787 DEBUG oslo_concurrency.lockutils [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 
740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:53:44.131 143787 DEBUG oslo_concurrency.lockutils [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:53:44.132 143787 DEBUG oslo_messaging._drivers.amqpdriver [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 0028ffd33a81435e95ee80784bb82830 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:53:44.134 143787 DEBUG oslo_messaging._drivers.amqpdriver [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: 339d14c1f0e6474b98e5b1726f7ea226 reply queue: reply_02dbf4ca539d4c419255761ef99731cf time elapsed: 0.1665589580006781s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-02 01:53:44.973 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:44.974 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:44.974 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:45.228 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 1525635fa56645d48a8ca963209bce73 reply to reply_c93f245d97184cab8c1af1830c6ac100 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-02 01:53:45.228 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:45.228 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3479c7173f46410b82f55e69b8c61428 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:53:45.228 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:45.229 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:45.231 143781 DEBUG nova.scheduler.manager [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['cc7c7a98-a60c-4072-8d0f-aaf27d70b18f'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-02 01:53:45.233 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:45.233 143781 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:45.234 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:45.237 143781 DEBUG nova.scheduler.request_filter [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:53:45.237 143781 DEBUG nova.scheduler.request_filter [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-02 01:53:45.238 143781 DEBUG nova.scheduler.request_filter [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:53:45.238 143781 DEBUG nova.scheduler.request_filter [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:53:45.238 143781 DEBUG nova.scheduler.request_filter [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 
1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:53:45.242 143781 DEBUG oslo_concurrency.lockutils [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:53:45.242 143781 DEBUG oslo_concurrency.lockutils [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:53:45.359 143781 DEBUG oslo_messaging._drivers.amqpdriver [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: d8d2159b5aaf43f2b8f8060c886721a4 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:53:45.361 143781 DEBUG oslo_concurrency.lockutils [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: 
waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:53:45.362 143781 DEBUG oslo_concurrency.lockutils [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:53:45.375 143781 DEBUG oslo_concurrency.lockutils [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:53:45.376 143781 DEBUG nova.scheduler.host_manager [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", 
"tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", "mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=24,free_disk_gb=76,free_ram_mb=30363,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=1,mapped=1,memory_mb=31899,memory_mb_used=1536,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", 
"reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", "memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=1,service_id=None,stats={failed_builds='0',io_workload='1',num_instances='1',num_os_type_None='1',num_proj_740edae3923f40e2b70ff2cb9340a700='1',num_proj_fcd11eb48a99473c970767c28064e907='0',num_task_None='1',num_vm_active='0',num_vm_building='1'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:53:44Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=1) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-02 01:53:45.377 143781 DEBUG nova.scheduler.host_manager [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-02 01:53:45.377 143781 DEBUG nova.scheduler.host_manager [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 447, 'disabled': False, 'disabled_reason': None, 
'last_seen_up': datetime.datetime(2026, 4, 2, 1, 53, 41, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 53, 41, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-02 01:53:45.378 143781 DEBUG nova.scheduler.host_manager [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-02 01:53:45.378 143781 DEBUG oslo_concurrency.lockutils [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.003s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:53:45.378 143781 INFO nova.scheduler.host_manager [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1 2026-04-02 01:53:45.379 143781 DEBUG nova.scheduler.manager [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 24576MB io_ops: 1 instances: 1]) 
_get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-02 01:53:45.379 143781 DEBUG nova.scheduler.manager [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 24576MB io_ops: 1 instances: 1, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-02 01:53:45.379 143781 DEBUG nova.scheduler.utils [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance cc7c7a98-a60c-4072-8d0f-aaf27d70b18f claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-02 01:53:45.449 143781 DEBUG nova.scheduler.manager [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: cc7c7a98-a60c-4072-8d0f-aaf27d70b18f] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 24576MB io_ops: 1 instances: 1 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-02 01:53:45.449 143781 DEBUG oslo_concurrency.lockutils [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:53:45.450 143781 DEBUG oslo_concurrency.lockutils [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:53:45.451 143781 DEBUG oslo_messaging._drivers.amqpdriver [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 95993a26f8194a7aaa4fb43a3eba39f7 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:53:45.453 143781 DEBUG oslo_messaging._drivers.amqpdriver [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: 1525635fa56645d48a8ca963209bce73 reply queue: reply_c93f245d97184cab8c1af1830c6ac100 time elapsed: 0.22494619999997667s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-02 01:53:46.236 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:46.236 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:46.236 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:46.975 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:53:46.975 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:46.975 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:47.301 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: bc942097d2744be08af18a9bb3a3545f __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:53:47.301 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: bc942097d2744be08af18a9bb3a3545f __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:53:47.301 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: bc942097d2744be08af18a9bb3a3545f __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:53:47.302 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:47.302 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:47.302 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:47.302 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: bc942097d2744be08af18a9bb3a3545f poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:53:47.302 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: bc942097d2744be08af18a9bb3a3545f poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:53:47.302 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: bc942097d2744be08af18a9bb3a3545f poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:53:47.302 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:47.302 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:47.302 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:47.302 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:53:47.302 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:47.302 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:53:47.304 143781 DEBUG oslo_concurrency.lockutils [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:47.304 143780 DEBUG oslo_concurrency.lockutils [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:47.304 143781 DEBUG oslo_concurrency.lockutils [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:47.304 143779 DEBUG oslo_concurrency.lockutils [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:47.304 143780 DEBUG oslo_concurrency.lockutils [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:47.305 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:47.305 143779 DEBUG oslo_concurrency.lockutils [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:47.305 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:47.305 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:47.305 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:47.305 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:47.305 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:47.305 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:47.305 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:47.305 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:47.306 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: bc942097d2744be08af18a9bb3a3545f __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:53:47.307 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:47.307 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: bc942097d2744be08af18a9bb3a3545f poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:53:47.307 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:47.307 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:47.309 143787 DEBUG oslo_concurrency.lockutils [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:47.309 143787 DEBUG oslo_concurrency.lockutils [req-78c45df1-36bd-4c5d-97ab-c5f1d93250c4 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:47.310 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:47.310 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:47.310 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:48.057 143779 DEBUG oslo_service.periodic_task [req-21394dba-7a24-41b2-8737-b7ab2c2ff550 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:53:48.061 143779 DEBUG oslo_concurrency.lockutils [req-11f906ca-e1d1-481e-ac32-2119ddbcff6c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:48.062 143779 DEBUG oslo_concurrency.lockutils [req-11f906ca-e1d1-481e-ac32-2119ddbcff6c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:48.306 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:48.306 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:48.306 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:48.307 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:48.307 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:48.307 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:48.307 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:48.307 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:48.307 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:48.311 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:48.311 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:48.311 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:48.925 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d5d6eb55b26b43059e67816ebee5f4a4 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:53:48.926 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:48.926 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d5d6eb55b26b43059e67816ebee5f4a4 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:53:48.926 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d5d6eb55b26b43059e67816ebee5f4a4 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:53:48.925 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d5d6eb55b26b43059e67816ebee5f4a4 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:53:48.926 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:48.926 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:48.926 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:48.926 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:48.926 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d5d6eb55b26b43059e67816ebee5f4a4 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:53:48.926 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d5d6eb55b26b43059e67816ebee5f4a4 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:53:48.926 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:48.926 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:48.927 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:48.927 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:48.928 143787 DEBUG oslo_concurrency.lockutils [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:48.928 143780 DEBUG oslo_concurrency.lockutils [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:48.929 143787 DEBUG oslo_concurrency.lockutils [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:48.929 143780 DEBUG oslo_concurrency.lockutils [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:48.929 143779 DEBUG oslo_concurrency.lockutils [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:48.929 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:48.929 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:48.929 143779 DEBUG oslo_concurrency.lockutils [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:48.929 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:48.929 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:48.930 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:48.930 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:48.930 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:48.930 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d5d6eb55b26b43059e67816ebee5f4a4 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:53:48.931 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:48.931 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d5d6eb55b26b43059e67816ebee5f4a4 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:53:48.931 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:48.931 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:48.934 143781 DEBUG oslo_concurrency.lockutils [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:48.934 143781 DEBUG oslo_concurrency.lockutils [req-ffc6e9d8-03e6-42db-87ed-1c694543a17f 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:48.934 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:48.935 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:48.935 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:48.935 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:48.935 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:49.930 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:49.931 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:49.931 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:49.932 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:49.932 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:49.932 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:49.936 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:49.936 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:49.936 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:49.937 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:49.937 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:49.937 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:51.933 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:51.934 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:51.934 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:51.935 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:51.935 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:51.935 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:51.938 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:51.939 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:51.939 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:51.940 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:51.940 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:51.940 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:53.041 143781 DEBUG oslo_service.periodic_task [req-24a62af8-3e7f-422c-939f-0e2974b0dd8a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:53:53.045 143781 DEBUG oslo_concurrency.lockutils [req-5694ef34-e8ff-4801-b030-aba1c04c45ba - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:53:53.046 143781 DEBUG oslo_concurrency.lockutils [req-5694ef34-e8ff-4801-b030-aba1c04c45ba - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:53:55.935 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:55.935 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:55.935 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:55.939 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:55.939 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:55.939 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:55.941 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:55.941 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:55.942 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:53:55.943 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:53:55.943 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:53:55.943 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:01.083 143787 DEBUG oslo_service.periodic_task [req-fd79eab8-6a37-4c31-a4f3-54ffa8c7f9ca - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:54:01.087 143787 DEBUG oslo_concurrency.lockutils [req-21bbad82-a23c-4417-b34e-91b00c91aca3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:54:01.087 143787 DEBUG oslo_concurrency.lockutils [req-21bbad82-a23c-4417-b34e-91b00c91aca3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:54:03.937 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:54:03.937 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:03.937 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:03.940 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:54:03.942 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:03.942 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:03.943 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:54:03.944 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:03.944 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:03.949 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:54:03.950 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:03.950 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:06.104 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 206bad7251cb46d2ba066eaf2a7332d2 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:54:06.104 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 206bad7251cb46d2ba066eaf2a7332d2 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:54:06.104 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 206bad7251cb46d2ba066eaf2a7332d2 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:54:06.104 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:06.104 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:06.104 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 206bad7251cb46d2ba066eaf2a7332d2 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:54:06.104 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:06.104 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 206bad7251cb46d2ba066eaf2a7332d2 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:54:06.105 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 206bad7251cb46d2ba066eaf2a7332d2 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:54:06.105 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:06.105 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:06.105 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:06.105 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:06.105 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:06.105 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:06.105 143787 DEBUG oslo_concurrency.lockutils [req-5e45a195-501f-4c72-b1b1-5fa86d47f58a 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:54:06.105 143779 DEBUG oslo_concurrency.lockutils [req-5e45a195-501f-4c72-b1b1-5fa86d47f58a 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:54:06.106 143787 DEBUG oslo_concurrency.lockutils [req-5e45a195-501f-4c72-b1b1-5fa86d47f58a 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:54:06.106 143781 DEBUG oslo_concurrency.lockutils [req-5e45a195-501f-4c72-b1b1-5fa86d47f58a 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:54:06.106 143779 DEBUG oslo_concurrency.lockutils [req-5e45a195-501f-4c72-b1b1-5fa86d47f58a 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:54:06.106 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 206bad7251cb46d2ba066eaf2a7332d2 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:54:06.106 143781 DEBUG oslo_concurrency.lockutils [req-5e45a195-501f-4c72-b1b1-5fa86d47f58a 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:54:06.106 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:06.106 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 206bad7251cb46d2ba066eaf2a7332d2 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:54:06.108 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:06.108 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:06.108 143780 DEBUG oslo_concurrency.lockutils [req-5e45a195-501f-4c72-b1b1-5fa86d47f58a 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:54:06.109 143780 DEBUG oslo_concurrency.lockutils [req-5e45a195-501f-4c72-b1b1-5fa86d47f58a 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:54:06.107 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:54:06.107 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:54:06.109 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:06.109 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:06.109 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:06.109 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:06.107 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:54:06.110 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:06.110 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:06.112 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:54:06.112 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:06.112 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:06.156 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ee072113d98a4b3788c462ed04d7a4aa __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:54:06.156 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ee072113d98a4b3788c462ed04d7a4aa __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:54:06.156 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ee072113d98a4b3788c462ed04d7a4aa __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:54:06.156 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:06.156 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:06.156 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:06.157 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ee072113d98a4b3788c462ed04d7a4aa poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:54:06.157 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ee072113d98a4b3788c462ed04d7a4aa poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:54:06.157 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ee072113d98a4b3788c462ed04d7a4aa poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:54:06.157 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:06.157 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:06.157 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:06.157 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:06.157 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:06.157 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:06.158 143779 DEBUG oslo_concurrency.lockutils [req-8ed85cbe-8fb1-44d1-bb73-2b3f8b2b3cc6 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:54:06.158 143787 DEBUG oslo_concurrency.lockutils [req-8ed85cbe-8fb1-44d1-bb73-2b3f8b2b3cc6 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:54:06.158 143781 DEBUG oslo_concurrency.lockutils [req-8ed85cbe-8fb1-44d1-bb73-2b3f8b2b3cc6 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:54:06.158 143779 DEBUG oslo_concurrency.lockutils [req-8ed85cbe-8fb1-44d1-bb73-2b3f8b2b3cc6 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 
0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:54:06.158 143787 DEBUG oslo_concurrency.lockutils [req-8ed85cbe-8fb1-44d1-bb73-2b3f8b2b3cc6 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:54:06.158 143781 DEBUG oslo_concurrency.lockutils [req-8ed85cbe-8fb1-44d1-bb73-2b3f8b2b3cc6 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:54:06.159 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ee072113d98a4b3788c462ed04d7a4aa __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:54:06.159 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:06.159 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:06.159 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:06.160 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:06.159 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:06.159 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:06.160 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:06.160 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:06.160 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:06.160 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:06.160 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ee072113d98a4b3788c462ed04d7a4aa poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:54:06.160 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:06.160 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:06.161 143780 DEBUG oslo_concurrency.lockutils [req-8ed85cbe-8fb1-44d1-bb73-2b3f8b2b3cc6 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 
2026-04-02 01:54:06.161 143780 DEBUG oslo_concurrency.lockutils [req-8ed85cbe-8fb1-44d1-bb73-2b3f8b2b3cc6 1afcb03dd1824c139dac3a43c5ad9a78 740edae3923f40e2b70ff2cb9340a700 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:54:06.163 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:06.163 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:06.163 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:07.160 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:07.161 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:07.161 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:07.161 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:07.161 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:07.161 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:07.161 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:07.161 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:07.161 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:07.164 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:07.164 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:07.164 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:09.163 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:09.163 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:09.164 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:09.164 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:09.164 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:09.164 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:09.164 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:09.165 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:09.165 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:09.167 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:09.167 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:09.167 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:13.166 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:13.166 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:13.166 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:13.168 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:13.168 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:13.168 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:13.169 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:13.170 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:13.170 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:13.172 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:13.172 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:13.173 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:14.053 143780 DEBUG oslo_service.periodic_task [req-08457efe-1409-46ad-9f07-e76542ff69fa - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:54:14.056 143780 DEBUG oslo_concurrency.lockutils [req-6db60410-8672-4d64-b445-cc6c8487f116 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:54:14.057 143780 DEBUG oslo_concurrency.lockutils [req-6db60410-8672-4d64-b445-cc6c8487f116 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:54:16.598 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1081e3d9cd6f4f0fa845943e3495a359 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:54:16.598 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1081e3d9cd6f4f0fa845943e3495a359 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:54:16.598 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1081e3d9cd6f4f0fa845943e3495a359 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:54:16.598 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1081e3d9cd6f4f0fa845943e3495a359 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:54:16.599 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:16.599 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:16.599 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1081e3d9cd6f4f0fa845943e3495a359 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:54:16.599 
143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1081e3d9cd6f4f0fa845943e3495a359 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:54:16.599 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:16.599 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:16.599 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1081e3d9cd6f4f0fa845943e3495a359 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:54:16.599 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:16.599 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:16.599 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1081e3d9cd6f4f0fa845943e3495a359 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:54:16.599 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:16.599 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:16.599 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:16.599 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:16.599 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:16.599 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:16.600 143779 DEBUG oslo_concurrency.lockutils [req-749c4bf9-da4e-47e9-9482-62a07843cdcb - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:54:16.600 143780 DEBUG oslo_concurrency.lockutils [req-749c4bf9-da4e-47e9-9482-62a07843cdcb - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:54:16.600 143779 DEBUG nova.scheduler.host_manager [req-749c4bf9-da4e-47e9-9482-62a07843cdcb - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:54:16.600 143780 DEBUG nova.scheduler.host_manager [req-749c4bf9-da4e-47e9-9482-62a07843cdcb - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:54:16.600 143781 DEBUG oslo_concurrency.lockutils [req-749c4bf9-da4e-47e9-9482-62a07843cdcb - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:54:16.600 143779 DEBUG oslo_concurrency.lockutils [req-749c4bf9-da4e-47e9-9482-62a07843cdcb - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:54:16.600 143787 DEBUG oslo_concurrency.lockutils [req-749c4bf9-da4e-47e9-9482-62a07843cdcb - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:54:16.600 143780 DEBUG oslo_concurrency.lockutils [req-749c4bf9-da4e-47e9-9482-62a07843cdcb - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:54:16.600 143781 DEBUG nova.scheduler.host_manager [req-749c4bf9-da4e-47e9-9482-62a07843cdcb - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:54:16.600 143787 DEBUG nova.scheduler.host_manager [req-749c4bf9-da4e-47e9-9482-62a07843cdcb - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 01:54:16.600 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:16.600 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:16.600 143781 DEBUG oslo_concurrency.lockutils [req-749c4bf9-da4e-47e9-9482-62a07843cdcb - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:54:16.601 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:16.601 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:16.601 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:16.601 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:16.601 143787 DEBUG oslo_concurrency.lockutils [req-749c4bf9-da4e-47e9-9482-62a07843cdcb - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:54:16.601 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:16.601 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:16.601 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:16.602 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:16.602 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:16.602 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:17.602 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:17.602 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:17.602 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:17.602 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:17.602 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:17.602 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:17.603 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:17.603 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:17.604 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:17.604 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:17.604 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:17.604 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:19.057 143779 DEBUG oslo_service.periodic_task [req-11f906ca-e1d1-481e-ac32-2119ddbcff6c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:54:19.060 143779 DEBUG oslo_concurrency.lockutils [req-a798b454-a84b-4228-82fe-d26e44e13e22 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:54:19.060 143779 DEBUG oslo_concurrency.lockutils [req-a798b454-a84b-4228-82fe-d26e44e13e22 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 
2026-04-02 01:54:19.604 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:19.604 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:19.604 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:19.604 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:19.604 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:19.604 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:19.605 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:19.605 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:19.605 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:19.605 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:19.606 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 
2026-04-02 01:54:19.606 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:23.607 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:54:23.607 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:54:23.607 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:23.607 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:23.607 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:54:23.607 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:23.608 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:23.608 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:23.608 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:23.609 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:54:23.610 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:23.610 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:24.041 143781 DEBUG oslo_service.periodic_task [req-5694ef34-e8ff-4801-b030-aba1c04c45ba - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:54:24.045 143781 DEBUG oslo_concurrency.lockutils [req-04a3bac1-268f-4a00-aca6-28f5e5f6aec8 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:54:24.046 143781 DEBUG oslo_concurrency.lockutils [req-04a3bac1-268f-4a00-aca6-28f5e5f6aec8 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:54:31.610 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:54:31.610 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:31.610 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:31.611 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:54:31.612 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:31.612 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:31.613 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:54:31.613 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:54:31.614 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:31.614 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:31.614 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:31.614 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:32.083 143787 DEBUG oslo_service.periodic_task [req-21bbad82-a23c-4417-b34e-91b00c91aca3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:54:32.086 143787 DEBUG oslo_concurrency.lockutils [req-9cfcfc38-fea1-416a-952b-30307f15eb42 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:54:32.087 143787 DEBUG oslo_concurrency.lockutils [req-9cfcfc38-fea1-416a-952b-30307f15eb42 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:54:42.402 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 3b30d99f8d2f479ea45183040ba42111 reply to reply_c93f245d97184cab8c1af1830c6ac100 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320
2026-04-02 01:54:42.402 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:42.403 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ae5b317c05614abb8d437bab248b97c7 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:54:42.403 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:42.403 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:42.404 143780 DEBUG nova.scheduler.manager [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['a91e8911-08d1-4caf-bda4-70f0c23d8f0d'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141
2026-04-02 01:54:42.406 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:54:42.407 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:42.407 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:42.411 143780 DEBUG nova.scheduler.request_filter [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:54:42.411 143780 DEBUG nova.scheduler.request_filter [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254
2026-04-02 01:54:42.411 143780 DEBUG nova.scheduler.request_filter [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:54:42.411 143780 DEBUG nova.scheduler.request_filter [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:54:42.412 143780 DEBUG nova.scheduler.request_filter [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:54:42.415 143780 DEBUG oslo_concurrency.lockutils [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:54:42.416 143780 DEBUG oslo_concurrency.lockutils [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:54:42.511 143780 DEBUG oslo_messaging._drivers.amqpdriver [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 6f11efd4b17b4dee8b932056887d6e1e NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656
2026-04-02 01:54:42.512 143780 DEBUG oslo_concurrency.lockutils [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:54:42.513 143780 DEBUG oslo_concurrency.lockutils [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:54:42.521 143780 DEBUG oslo_concurrency.lockutils [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:54:42.521 143780 DEBUG nova.scheduler.host_manager [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", "tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", "mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", "memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:54:30Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167
2026-04-02 01:54:42.522 143780 DEBUG nova.scheduler.host_manager [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170
2026-04-02 01:54:42.523 143780 DEBUG nova.scheduler.host_manager [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 453, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 2, 1, 54, 41, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 54, 41, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173
2026-04-02 01:54:42.523 143780 DEBUG nova.scheduler.host_manager [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176
2026-04-02 01:54:42.523 143780 DEBUG oslo_concurrency.lockutils [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:54:42.523 143780 INFO nova.scheduler.host_manager [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1
2026-04-02 01:54:42.524 143780 DEBUG nova.scheduler.manager [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610
2026-04-02 01:54:42.524 143780 DEBUG nova.scheduler.manager [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632
2026-04-02 01:54:42.524 143780 DEBUG nova.scheduler.utils [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance a91e8911-08d1-4caf-bda4-70f0c23d8f0d claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253
2026-04-02 01:54:42.648 143780 DEBUG nova.scheduler.manager [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: a91e8911-08d1-4caf-bda4-70f0c23d8f0d] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513
2026-04-02 01:54:42.649 143780 DEBUG oslo_concurrency.lockutils [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:54:42.649 143780 DEBUG oslo_concurrency.lockutils [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:54:42.650 143780 DEBUG oslo_messaging._drivers.amqpdriver [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 31759f9957ba495598e6e2264b94882e NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656
2026-04-02 01:54:42.652 143780 DEBUG oslo_messaging._drivers.amqpdriver [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: 3b30d99f8d2f479ea45183040ba42111 reply queue: reply_c93f245d97184cab8c1af1830c6ac100 time elapsed: 0.24971365599958517s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118
2026-04-02 01:54:43.408 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:54:43.409 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:43.409 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:43.513 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: f98955b4759b44dea7511ba75410c940 reply to reply_c93f245d97184cab8c1af1830c6ac100 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320
2026-04-02 01:54:43.514 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:43.514 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ffc8806c6e394ee385036489161aa373 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:54:43.514 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:43.514 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:43.515 143779 DEBUG nova.scheduler.manager [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['d26c886b-10d7-497a-8c77-807d428a46d2'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141
2026-04-02 01:54:43.517 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:54:43.517 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:54:43.517 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:54:43.520 143779 DEBUG nova.scheduler.request_filter [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:54:43.520 143779 DEBUG nova.scheduler.request_filter [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254
2026-04-02 01:54:43.520 143779 DEBUG nova.scheduler.request_filter [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:54:43.521 143779 DEBUG nova.scheduler.request_filter [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:54:43.521 143779 DEBUG nova.scheduler.request_filter [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:54:43.525 143779 DEBUG oslo_concurrency.lockutils [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:54:43.526 143779 DEBUG oslo_concurrency.lockutils [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:54:43.575 143779 DEBUG oslo_messaging._drivers.amqpdriver [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 2ee4d86c01c44589ba1e9679ea135ead NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656
2026-04-02 01:54:43.577 143779 DEBUG oslo_concurrency.lockutils [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:54:43.577 143779 DEBUG oslo_concurrency.lockutils [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:54:43.585 143779 DEBUG oslo_concurrency.lockutils [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:54:43.585 143779 DEBUG nova.scheduler.host_manager [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", "tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", "mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=76,free_ram_mb=30363,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=1,mapped=1,memory_mb=31899,memory_mb_used=1536,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", "memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=1,service_id=None,stats={failed_builds='0',io_workload='1',num_instances='1',num_os_type_None='1',num_proj_8eee4ba3ae584a7f8f8facea3bccd094='1',num_task_None='1',num_vm_building='1'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:54:43Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=1) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167
2026-04-02 01:54:43.586 143779 DEBUG nova.scheduler.host_manager [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170
2026-04-02 01:54:43.586 143779 DEBUG nova.scheduler.host_manager [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 453, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 2, 1, 54, 41, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 54,
41, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-02 01:54:43.586 143779 DEBUG nova.scheduler.host_manager [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-02 01:54:43.587 143779 DEBUG oslo_concurrency.lockutils [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.<locals>._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:54:43.587 143779 INFO nova.scheduler.host_manager [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1 2026-04-02 01:54:43.587 143779 DEBUG nova.scheduler.manager [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 26624MB io_ops: 1 instances: 1]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-02 01:54:43.587 143779 DEBUG nova.scheduler.manager [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 
6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 26624MB io_ops: 1 instances: 1, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-02 01:54:43.587 143779 DEBUG nova.scheduler.utils [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance d26c886b-10d7-497a-8c77-807d428a46d2 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-02 01:54:43.655 143779 DEBUG nova.scheduler.manager [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: d26c886b-10d7-497a-8c77-807d428a46d2] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 26624MB io_ops: 1 instances: 1 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-02 01:54:43.656 143779 DEBUG oslo_concurrency.lockutils [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.<locals>._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:54:43.656 143779 DEBUG oslo_concurrency.lockutils [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 
6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.<locals>._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:54:43.658 143779 DEBUG oslo_messaging._drivers.amqpdriver [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 86f09e0cbe694fafb2f61f56592fe9e9 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:54:43.660 143779 DEBUG oslo_messaging._drivers.amqpdriver [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: f98955b4759b44dea7511ba75410c940 reply queue: reply_c93f245d97184cab8c1af1830c6ac100 time elapsed: 0.1466565509999782s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-02 01:54:44.061 143780 DEBUG oslo_service.periodic_task [req-6db60410-8672-4d64-b445-cc6c8487f116 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:54:44.065 143780 DEBUG oslo_concurrency.lockutils [req-a196548b-9aca-4500-b6d1-485b7d32d8d6 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:54:44.066 143780 DEBUG oslo_concurrency.lockutils [req-a196548b-9aca-4500-b6d1-485b7d32d8d6 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by 
"nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:54:44.518 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:44.519 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:44.519 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:45.410 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:45.410 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:45.410 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:45.704 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: dffd6ab4d78741f98d49f15ac6fb1a88 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:54:45.704 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: dffd6ab4d78741f98d49f15ac6fb1a88 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:54:45.704 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: dffd6ab4d78741f98d49f15ac6fb1a88 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:54:45.704 143781 DEBUG oslo_messaging._drivers.amqpdriver 
[-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:45.704 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:45.704 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:45.704 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: dffd6ab4d78741f98d49f15ac6fb1a88 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:54:45.704 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: dffd6ab4d78741f98d49f15ac6fb1a88 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:54:45.704 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: dffd6ab4d78741f98d49f15ac6fb1a88 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:54:45.705 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:45.705 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:45.705 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:45.705 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:45.705 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:45.705 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:45.706 143787 DEBUG oslo_concurrency.lockutils [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:54:45.707 143780 DEBUG oslo_concurrency.lockutils [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:54:45.707 143787 DEBUG oslo_concurrency.lockutils [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:54:45.707 143780 DEBUG oslo_concurrency.lockutils [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:54:45.707 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:45.707 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:45.707 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:45.707 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:45.707 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:45.707 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:45.707 143781 DEBUG oslo_concurrency.lockutils [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:54:45.708 143781 DEBUG oslo_concurrency.lockutils [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:54:45.709 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message 
with unique_id: dffd6ab4d78741f98d49f15ac6fb1a88 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:54:45.709 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:45.709 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:45.709 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: dffd6ab4d78741f98d49f15ac6fb1a88 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:54:45.709 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:45.709 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:45.709 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:45.710 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:45.711 143779 DEBUG oslo_concurrency.lockutils [req-7e16b129-9cda-4915-8c48-7642c2a85176 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:54:45.712 143779 DEBUG oslo_concurrency.lockutils [req-7e16b129-9cda-4915-8c48-7642c2a85176 
1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:54:45.713 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:45.713 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:45.713 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:46.611 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f960727af85e4bd3b2260d9114f7f70d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:54:46.611 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f960727af85e4bd3b2260d9114f7f70d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:54:46.611 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f960727af85e4bd3b2260d9114f7f70d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:54:46.611 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f960727af85e4bd3b2260d9114f7f70d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:54:46.611 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:46.611 143780 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f960727af85e4bd3b2260d9114f7f70d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:54:46.611 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:46.611 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:46.611 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:46.611 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f960727af85e4bd3b2260d9114f7f70d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:54:46.611 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f960727af85e4bd3b2260d9114f7f70d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:54:46.612 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:46.611 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:46.612 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f960727af85e4bd3b2260d9114f7f70d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:54:46.612 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:46.612 143779 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:46.612 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:46.612 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:46.612 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:46.612 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:46.613 143780 DEBUG oslo_concurrency.lockutils [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:54:46.614 143780 DEBUG oslo_concurrency.lockutils [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:54:46.614 143787 DEBUG oslo_concurrency.lockutils [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by 
"nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:54:46.614 143787 DEBUG oslo_concurrency.lockutils [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:54:46.614 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:46.614 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:46.614 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:46.614 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:46.615 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:46.615 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:46.614 143779 DEBUG oslo_concurrency.lockutils [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by 
"nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:54:46.615 143781 DEBUG oslo_concurrency.lockutils [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:54:46.615 143779 DEBUG oslo_concurrency.lockutils [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:54:46.615 143781 DEBUG oslo_concurrency.lockutils [req-de0c8e45-020a-43cc-afe9-e7e2244fcb6d 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:54:46.615 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:46.615 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:46.616 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:46.616 
143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:46.616 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:46.616 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:47.616 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:47.616 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:47.616 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:47.617 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:47.617 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:47.617 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:47.618 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:47.618 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 
01:54:47.618 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:47.618 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:47.618 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:47.618 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:49.067 143779 DEBUG oslo_service.periodic_task [req-a798b454-a84b-4228-82fe-d26e44e13e22 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:54:49.071 143779 DEBUG oslo_concurrency.lockutils [req-bd75d2dc-b37a-4567-963f-f45668ab77c1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:54:49.072 143779 DEBUG oslo_concurrency.lockutils [req-bd75d2dc-b37a-4567-963f-f45668ab77c1 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:54:49.617 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:49.618 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:49.618 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:49.619 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:49.619 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:49.619 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:49.619 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:49.619 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:49.620 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:49.620 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:49.620 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:49.620 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:53.621 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:53.621 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:53.621 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:53.622 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:53.622 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:53.622 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:53.622 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:53.622 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:53.622 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:53.623 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:54:53.623 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:54:53.623 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:54:55.041 143781 DEBUG oslo_service.periodic_task [req-04a3bac1-268f-4a00-aca6-28f5e5f6aec8 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:54:55.045 143781 DEBUG oslo_concurrency.lockutils [req-b1c07ad8-bb43-4de3-afcf-3d66f3a3730a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:54:55.046 143781 DEBUG oslo_concurrency.lockutils [req-b1c07ad8-bb43-4de3-afcf-3d66f3a3730a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:01.624 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:01.625 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:01.625 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:01.625 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:01.625 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:01.625 143779 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:01.626 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:01.626 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:01.626 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:01.626 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:01.626 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:01.626 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:02.092 143787 DEBUG oslo_service.periodic_task [req-9cfcfc38-fea1-416a-952b-30307f15eb42 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:55:02.096 143787 DEBUG oslo_concurrency.lockutils [req-d3b8e2e8-b125-4f26-8cc3-211ef98216f3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:55:02.096 143787 DEBUG oslo_concurrency.lockutils [req-d3b8e2e8-b125-4f26-8cc3-211ef98216f3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:06.576 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 13bbb6674c1c4e9eae51c4a51a77645b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:55:06.576 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 13bbb6674c1c4e9eae51c4a51a77645b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:55:06.576 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 13bbb6674c1c4e9eae51c4a51a77645b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:55:06.576 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 13bbb6674c1c4e9eae51c4a51a77645b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:55:06.576 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:06.576 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:06.576 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:06.576 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:06.576 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 13bbb6674c1c4e9eae51c4a51a77645b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 
2026-04-02 01:55:06.576 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 13bbb6674c1c4e9eae51c4a51a77645b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:06.576 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 13bbb6674c1c4e9eae51c4a51a77645b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:06.576 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 13bbb6674c1c4e9eae51c4a51a77645b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:06.577 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:06.577 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:06.577 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:06.577 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:06.577 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:06.577 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:06.577 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:06.577 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:06.577 143787 DEBUG oslo_concurrency.lockutils [req-daaee9c4-0504-42c0-8d52-64dba7f8e01a 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:06.577 143779 DEBUG oslo_concurrency.lockutils [req-daaee9c4-0504-42c0-8d52-64dba7f8e01a 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:06.577 143781 DEBUG oslo_concurrency.lockutils [req-daaee9c4-0504-42c0-8d52-64dba7f8e01a 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:06.577 143787 DEBUG oslo_concurrency.lockutils [req-daaee9c4-0504-42c0-8d52-64dba7f8e01a 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:06.577 143779 DEBUG oslo_concurrency.lockutils [req-daaee9c4-0504-42c0-8d52-64dba7f8e01a 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:06.577 143780 DEBUG oslo_concurrency.lockutils [req-daaee9c4-0504-42c0-8d52-64dba7f8e01a 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:06.578 143781 DEBUG oslo_concurrency.lockutils [req-daaee9c4-0504-42c0-8d52-64dba7f8e01a 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:06.578 143780 DEBUG oslo_concurrency.lockutils [req-daaee9c4-0504-42c0-8d52-64dba7f8e01a 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:06.579 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:06.579 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:06.579 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:06.579 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:06.579 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:06.579 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:06.579 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:06.579 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:06.579 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:06.579 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:06.579 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:06.579 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:06.637 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: e904a7288a614420b2b49e3040fdde9c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:55:06.637 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: e904a7288a614420b2b49e3040fdde9c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:55:06.637 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: e904a7288a614420b2b49e3040fdde9c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:55:06.637 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: e904a7288a614420b2b49e3040fdde9c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:55:06.637 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:06.637 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:06.637 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:06.638 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: e904a7288a614420b2b49e3040fdde9c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:06.638 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: e904a7288a614420b2b49e3040fdde9c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:06.638 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: e904a7288a614420b2b49e3040fdde9c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:06.638 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:06.638 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:06.638 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: e904a7288a614420b2b49e3040fdde9c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:06.638 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:06.638 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:06.638 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:06.638 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:06.638 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:06.638 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:06.638 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:06.638 143779 DEBUG oslo_concurrency.lockutils [req-ab33a19e-0b16-4ee7-9eee-8321c88da77f 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:06.639 143781 DEBUG oslo_concurrency.lockutils [req-ab33a19e-0b16-4ee7-9eee-8321c88da77f 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:06.639 143779 DEBUG oslo_concurrency.lockutils [req-ab33a19e-0b16-4ee7-9eee-8321c88da77f 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:06.639 143787 DEBUG oslo_concurrency.lockutils [req-ab33a19e-0b16-4ee7-9eee-8321c88da77f 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:06.639 143780 DEBUG oslo_concurrency.lockutils [req-ab33a19e-0b16-4ee7-9eee-8321c88da77f 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:06.639 143781 DEBUG oslo_concurrency.lockutils [req-ab33a19e-0b16-4ee7-9eee-8321c88da77f 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:06.639 143787 DEBUG oslo_concurrency.lockutils [req-ab33a19e-0b16-4ee7-9eee-8321c88da77f 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:06.639 143780 DEBUG oslo_concurrency.lockutils [req-ab33a19e-0b16-4ee7-9eee-8321c88da77f 1da9bd7b6a524f92b283bef176cd2f22 8eee4ba3ae584a7f8f8facea3bccd094 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:06.640 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:06.640 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:06.640 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:06.640 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:06.640 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:06.640 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:06.640 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:06.641 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:06.641 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:06.641 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:06.641 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:06.641 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:07.641 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:07.641 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:07.641 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:07.642 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:07.642 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:07.642 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:07.642 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:07.642 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:07.642 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:07.642 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:07.642 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:07.642 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:09.643 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:09.643 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:09.643 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:09.644 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:09.645 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:09.645 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:09.645 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:09.645 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:09.645 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:09.645 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:09.645 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:09.645 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:13.645 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:13.645 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:13.645 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:13.648 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:13.648 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:13.648 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:13.648 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:13.649 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:13.649 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:13.649 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:13.650 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:13.650 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:15.044 143780 DEBUG oslo_service.periodic_task [req-a196548b-9aca-4500-b6d1-485b7d32d8d6 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:55:15.049 143780 DEBUG oslo_concurrency.lockutils [req-4e17d3f6-94ea-46c0-b013-9a51c1cbd6e4 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:15.049 143780 DEBUG oslo_concurrency.lockutils [req-4e17d3f6-94ea-46c0-b013-9a51c1cbd6e4 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:16.920 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: dff90cd684c84e49b8a69a2126c29854 reply to reply_02dbf4ca539d4c419255761ef99731cf __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320
2026-04-02 01:55:16.920 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:16.920 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2ec552d568f44791adef6f93fd1ec927 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:16.921 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:16.921 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:16.922 143787 DEBUG nova.scheduler.manager [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['f5b64d3d-8a70-4e0f-9e59-7d2f279aa0d9'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141
2026-04-02 01:55:16.924 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:16.924 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:16.924 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:16.927 143787 DEBUG nova.scheduler.request_filter [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:55:16.927 143787 DEBUG nova.scheduler.request_filter [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254
2026-04-02 01:55:16.927 143787 DEBUG nova.scheduler.request_filter [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:55:16.928 143787 DEBUG nova.scheduler.request_filter [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:55:16.928 143787 DEBUG nova.scheduler.request_filter [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:55:16.932 143787 DEBUG oslo_concurrency.lockutils [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:16.932 143787 DEBUG oslo_concurrency.lockutils [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:17.041 143787 DEBUG oslo_messaging._drivers.amqpdriver [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST
unique_id: 5be4fbd8092c48af909497c4d36c32ab NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:55:17.043 143787 DEBUG oslo_concurrency.lockutils [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:55:17.043 143787 DEBUG oslo_concurrency.lockutils [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:17.055 143787 DEBUG oslo_concurrency.lockutils [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:55:17.056 143787 DEBUG nova.scheduler.host_manager [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": 
{"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", "tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", "mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, 
"nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", "memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0',io_workload='0',num_instances='0',num_os_type_None='0',num_proj_8eee4ba3ae584a7f8f8facea3bccd094='0',num_task_None='0',num_vm_building='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:55:07Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-02 01:55:17.057 143787 DEBUG nova.scheduler.host_manager [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-02 
01:55:17.057 143787 DEBUG nova.scheduler.host_manager [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 456, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 2, 1, 55, 11, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 55, 11, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-02 01:55:17.057 143787 DEBUG nova.scheduler.host_manager [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-02 01:55:17.057 143787 DEBUG oslo_concurrency.lockutils [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:17.057 143787 INFO nova.scheduler.host_manager [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 
6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1 2026-04-02 01:55:17.058 143787 DEBUG nova.scheduler.manager [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-02 01:55:17.058 143787 DEBUG nova.scheduler.manager [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-02 01:55:17.058 143787 DEBUG nova.scheduler.utils [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance f5b64d3d-8a70-4e0f-9e59-7d2f279aa0d9 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-02 01:55:17.146 143787 DEBUG nova.scheduler.manager [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: f5b64d3d-8a70-4e0f-9e59-7d2f279aa0d9] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 26624MB io_ops: 0 instances: 0 _consume_selected_host 
/usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-02 01:55:17.146 143787 DEBUG oslo_concurrency.lockutils [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:55:17.147 143787 DEBUG oslo_concurrency.lockutils [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:17.148 143787 DEBUG oslo_messaging._drivers.amqpdriver [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: b1a5bf80cd9841b69bc214800515a14d NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:55:17.150 143787 DEBUG oslo_messaging._drivers.amqpdriver [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: dff90cd684c84e49b8a69a2126c29854 reply queue: reply_02dbf4ca539d4c419255761ef99731cf time elapsed: 0.23019444500005193s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-02 
01:55:17.926 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:17.926 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:17.926 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:18.221 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: a1b00feeda414fd2bbfc2c3ea3d80920 reply to reply_02dbf4ca539d4c419255761ef99731cf __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-02 01:55:18.222 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:18.222 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 970a39e6278b426c8e988d7dd7e5d64a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:55:18.222 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:18.222 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:18.224 143781 DEBUG nova.scheduler.manager [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['7090e86d-06b3-4ec6-8c21-5e8d233ae480'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-02 01:55:18.226 143781 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:18.226 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:18.226 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:18.229 143781 DEBUG nova.scheduler.request_filter [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:55:18.231 143781 DEBUG nova.scheduler.request_filter [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-02 01:55:18.231 143781 DEBUG nova.scheduler.request_filter [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:55:18.232 143781 DEBUG nova.scheduler.request_filter [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper 
/usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:55:18.232 143781 DEBUG nova.scheduler.request_filter [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:55:18.237 143781 DEBUG oslo_concurrency.lockutils [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:55:18.237 143781 DEBUG oslo_concurrency.lockutils [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:18.290 143781 DEBUG oslo_messaging._drivers.amqpdriver [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: a22762be3d1f43b3ae371fe8f76a8d5c NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:55:18.292 143781 DEBUG oslo_concurrency.lockutils [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 
6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:55:18.292 143781 DEBUG oslo_concurrency.lockutils [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:18.300 143781 DEBUG oslo_concurrency.lockutils [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:55:18.300 143781 DEBUG nova.scheduler.host_manager [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", 
"xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", "tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", "mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=26,free_disk_gb=76,free_ram_mb=30363,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=1,mapped=1,memory_mb=31899,memory_mb_used=1536,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", 
"nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", "memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=1,service_id=None,stats={failed_builds='0',io_workload='1',num_instances='1',num_os_type_None='1',num_proj_8eee4ba3ae584a7f8f8facea3bccd094='0',num_proj_fba0763e6b8e4c36b71c8af0815b5fc8='1',num_task_None='1',num_vm_building='1'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:55:17Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=1) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-02 01:55:18.301 143781 DEBUG nova.scheduler.host_manager [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-02 01:55:18.301 143781 DEBUG nova.scheduler.host_manager [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 
'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 456, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 2, 1, 55, 11, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 55, 11, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-02 01:55:18.301 143781 DEBUG nova.scheduler.host_manager [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-02 01:55:18.302 143781 DEBUG oslo_concurrency.lockutils [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:18.302 143781 INFO nova.scheduler.host_manager [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1 2026-04-02 01:55:18.302 143781 DEBUG nova.scheduler.manager [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 
6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 26624MB io_ops: 1 instances: 1]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-02 01:55:18.302 143781 DEBUG nova.scheduler.manager [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 26624MB io_ops: 1 instances: 1, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-02 01:55:18.302 143781 DEBUG nova.scheduler.utils [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance 7090e86d-06b3-4ec6-8c21-5e8d233ae480 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-02 01:55:18.368 143781 DEBUG nova.scheduler.manager [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: 7090e86d-06b3-4ec6-8c21-5e8d233ae480] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 26624MB io_ops: 1 instances: 1 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-02 01:55:18.368 143781 DEBUG oslo_concurrency.lockutils [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock 
"('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:55:18.369 143781 DEBUG oslo_concurrency.lockutils [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:18.371 143781 DEBUG oslo_messaging._drivers.amqpdriver [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: b4db37f9ac6a40bba7e76edfd5d2af55 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:55:18.373 143781 DEBUG oslo_messaging._drivers.amqpdriver [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: a1b00feeda414fd2bbfc2c3ea3d80920 reply queue: reply_02dbf4ca539d4c419255761ef99731cf time elapsed: 0.15080900999964797s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-02 01:55:19.227 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:19.228 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:19.228 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:19.927 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:19.928 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:19.928 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:20.057 143779 DEBUG oslo_service.periodic_task [req-bd75d2dc-b37a-4567-963f-f45668ab77c1 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:55:20.061 143779 DEBUG oslo_concurrency.lockutils [req-9bdab5b8-30a0-4593-bdb7-57edf9a67b92 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:55:20.062 143779 DEBUG oslo_concurrency.lockutils [req-9bdab5b8-30a0-4593-bdb7-57edf9a67b92 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:20.117 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 2ef3147ebd154bfda7f2c9994a698d30 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:55:20.117 143780
DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 2ef3147ebd154bfda7f2c9994a698d30 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:55:20.118 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 2ef3147ebd154bfda7f2c9994a698d30 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:55:20.118 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 2ef3147ebd154bfda7f2c9994a698d30 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:55:20.118 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:20.118 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:20.118 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:20.118 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2ef3147ebd154bfda7f2c9994a698d30 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:55:20.118 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2ef3147ebd154bfda7f2c9994a698d30 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:55:20.118 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:20.118 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2ef3147ebd154bfda7f2c9994a698d30 poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:55:20.118 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:20.118 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2ef3147ebd154bfda7f2c9994a698d30 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:55:20.118 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:20.118 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:20.118 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:20.118 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:20.118 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:20.118 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:20.119 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:20.120 143781 DEBUG oslo_concurrency.lockutils [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock 
"host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:55:20.120 143779 DEBUG oslo_concurrency.lockutils [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:55:20.120 143780 DEBUG oslo_concurrency.lockutils [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:55:20.120 143781 DEBUG oslo_concurrency.lockutils [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:20.120 143779 DEBUG oslo_concurrency.lockutils [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:20.121 143780 DEBUG oslo_concurrency.lockutils [req-de1b599f-b8a3-4fbd-9feb-15148d140241 
6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:20.121 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:20.121 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:20.121 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:20.121 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:20.121 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:20.121 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:20.121 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:20.121 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:20.121 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:20.121 143787 DEBUG 
oslo_concurrency.lockutils [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:55:20.122 143787 DEBUG oslo_concurrency.lockutils [req-de1b599f-b8a3-4fbd-9feb-15148d140241 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:20.122 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:20.123 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:20.123 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:21.122 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:21.122 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:21.122 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:21.122 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:21.123 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:21.123 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:21.123 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:21.123 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:21.123 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:21.124 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:21.124 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:21.124 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:21.297 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c3362a042ff64748a195462e8f991b26 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:55:21.297 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c3362a042ff64748a195462e8f991b26 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 
2026-04-02 01:55:21.297 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c3362a042ff64748a195462e8f991b26 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:55:21.297 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:21.297 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:21.297 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c3362a042ff64748a195462e8f991b26 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:55:21.297 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:21.297 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c3362a042ff64748a195462e8f991b26 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:55:21.298 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c3362a042ff64748a195462e8f991b26 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:55:21.298 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c3362a042ff64748a195462e8f991b26 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:55:21.298 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:21.298 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:21.298 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:21.298 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:21.298 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c3362a042ff64748a195462e8f991b26 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:55:21.298 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:21.298 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:21.298 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:21.298 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:21.298 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:21.300 143781 DEBUG oslo_concurrency.lockutils [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 
2026-04-02 01:55:21.300 143787 DEBUG oslo_concurrency.lockutils [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:55:21.300 143781 DEBUG oslo_concurrency.lockutils [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:21.300 143787 DEBUG oslo_concurrency.lockutils [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:21.300 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:21.300 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:21.300 143779 DEBUG oslo_concurrency.lockutils [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 
0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:55:21.301 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:21.301 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:21.301 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:21.301 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:21.301 143780 DEBUG oslo_concurrency.lockutils [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:55:21.301 143779 DEBUG oslo_concurrency.lockutils [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:21.301 143780 DEBUG oslo_concurrency.lockutils [req-35591b95-4bb8-4657-b1c3-2b001ba0075b 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 
0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:21.301 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:21.301 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:21.301 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:21.301 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:21.301 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:21.301 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:22.302 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:22.302 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:22.302 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:22.302 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:22.302 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume 
poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:22.302 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:22.303 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:22.303 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:22.303 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:22.303 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:22.303 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:22.303 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:24.304 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:24.304 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:24.304 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:24.305 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:24.305 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:24.305 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:24.305 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:24.305 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:24.306 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:24.306 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:24.306 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:24.306 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:26.041 143781 DEBUG oslo_service.periodic_task [req-b1c07ad8-bb43-4de3-afcf-3d66f3a3730a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:55:26.046 143781 DEBUG oslo_concurrency.lockutils [req-5b1cc8c5-ed46-4f30-b18b-6d98b55dd2d6 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:55:26.046 143781 DEBUG oslo_concurrency.lockutils [req-5b1cc8c5-ed46-4f30-b18b-6d98b55dd2d6 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:28.307 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:28.308 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:28.308 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:28.308 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:28.308 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:28.308 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:28.308 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:28.308 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:28.308 143779 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:28.309 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:28.310 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:28.310 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:33.083 143787 DEBUG oslo_service.periodic_task [req-d3b8e2e8-b125-4f26-8cc3-211ef98216f3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:55:33.088 143787 DEBUG oslo_concurrency.lockutils [req-64e2c892-2550-450d-9db4-5c97aad4de69 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:55:33.088 143787 DEBUG oslo_concurrency.lockutils [req-64e2c892-2550-450d-9db4-5c97aad4de69 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:36.311 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:36.312 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:36.312 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:36.312 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:36.313 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:36.313 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:36.314 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:36.314 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:36.314 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:36.317 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:36.317 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:36.317 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:38.052 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: faa295ac8dac40968998834ce702c80f __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:55:38.052 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: faa295ac8dac40968998834ce702c80f __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:55:38.052 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: faa295ac8dac40968998834ce702c80f __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:55:38.052 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: faa295ac8dac40968998834ce702c80f __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:55:38.052 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:38.052 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:38.052 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:38.052 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: faa295ac8dac40968998834ce702c80f poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:38.052 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: faa295ac8dac40968998834ce702c80f poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:38.052 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: faa295ac8dac40968998834ce702c80f poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:38.052 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:38.052 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:38.053 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:38.053 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:38.053 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: faa295ac8dac40968998834ce702c80f poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:38.053 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:38.053 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:38.053 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:38.053 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:38.053 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:38.053 143781 DEBUG oslo_concurrency.lockutils [req-6f1a6b6c-65a2-455e-89a0-5c860a1f90f8 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:38.053 143780 DEBUG oslo_concurrency.lockutils [req-6f1a6b6c-65a2-455e-89a0-5c860a1f90f8 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:38.053 143781 DEBUG oslo_concurrency.lockutils [req-6f1a6b6c-65a2-455e-89a0-5c860a1f90f8 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:38.053 143787 DEBUG oslo_concurrency.lockutils [req-6f1a6b6c-65a2-455e-89a0-5c860a1f90f8 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:38.054 143780 DEBUG oslo_concurrency.lockutils [req-6f1a6b6c-65a2-455e-89a0-5c860a1f90f8 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:38.054 143787 DEBUG oslo_concurrency.lockutils [req-6f1a6b6c-65a2-455e-89a0-5c860a1f90f8 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:38.054 143779 DEBUG oslo_concurrency.lockutils [req-6f1a6b6c-65a2-455e-89a0-5c860a1f90f8 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:38.054 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:38.054 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:38.054 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:38.054 143779 DEBUG oslo_concurrency.lockutils [req-6f1a6b6c-65a2-455e-89a0-5c860a1f90f8 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:38.055 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:38.055 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:38.055 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:38.055 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:38.056 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:38.056 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:38.056 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:38.056 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:38.056 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:38.093 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 72b7d53a09824f3fa581dcacbbfed28c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:55:38.093 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 72b7d53a09824f3fa581dcacbbfed28c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:55:38.093 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 72b7d53a09824f3fa581dcacbbfed28c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:55:38.093 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 72b7d53a09824f3fa581dcacbbfed28c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:55:38.093 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:38.093 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:38.093 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 72b7d53a09824f3fa581dcacbbfed28c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:38.093 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 72b7d53a09824f3fa581dcacbbfed28c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:38.093 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:38.093 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:38.093 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:38.093 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 72b7d53a09824f3fa581dcacbbfed28c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:38.094 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:38.094 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:38.094 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:38.094 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:38.094 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 72b7d53a09824f3fa581dcacbbfed28c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:38.094 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:38.094 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:38.094 143781 DEBUG oslo_concurrency.lockutils [req-852939f2-4668-4e7e-b5ca-d99e705f585f 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:38.094 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:38.094 143781 DEBUG oslo_concurrency.lockutils [req-852939f2-4668-4e7e-b5ca-d99e705f585f 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:38.094 143779 DEBUG oslo_concurrency.lockutils [req-852939f2-4668-4e7e-b5ca-d99e705f585f 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:38.095 143780 DEBUG oslo_concurrency.lockutils [req-852939f2-4668-4e7e-b5ca-d99e705f585f 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:38.095 143779 DEBUG oslo_concurrency.lockutils [req-852939f2-4668-4e7e-b5ca-d99e705f585f 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:38.095 143780 DEBUG oslo_concurrency.lockutils [req-852939f2-4668-4e7e-b5ca-d99e705f585f 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:38.095 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:38.095 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:38.095 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:38.095 143787 DEBUG oslo_concurrency.lockutils [req-852939f2-4668-4e7e-b5ca-d99e705f585f 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:38.096 143787 DEBUG oslo_concurrency.lockutils [req-852939f2-4668-4e7e-b5ca-d99e705f585f 6690cfd38c3a4cb7afb3481ae73dc1e2 fba0763e6b8e4c36b71c8af0815b5fc8 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:38.096 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:38.096 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:38.096 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:38.096 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:38.096 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:38.096 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:38.096 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:38.097 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:38.097 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:39.097 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:39.097 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:39.097 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:39.097 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:39.097 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:39.097 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:39.097 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:39.098 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:39.098 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:39.098 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:39.098 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:39.098 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:41.099 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:41.099 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:41.100 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:41.100 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:41.100 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:41.100 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:41.100 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:41.100 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:41.101 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:41.101 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:41.101 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:41.101 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:45.056 143780 DEBUG oslo_service.periodic_task [req-4e17d3f6-94ea-46c0-b013-9a51c1cbd6e4 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:55:45.059 143780 DEBUG oslo_concurrency.lockutils [req-05f4ac88-1916-4b47-a52d-c57add3596fb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:45.060 143780 DEBUG oslo_concurrency.lockutils [req-05f4ac88-1916-4b47-a52d-c57add3596fb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:45.102 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:45.103 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:45.103 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:45.103 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:45.103 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:45.104 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:45.104 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:45.104 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:45.105 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:45.105 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:45.105 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:45.105 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:50.069 143779 DEBUG oslo_service.periodic_task [req-9bdab5b8-30a0-4593-bdb7-57edf9a67b92 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:55:50.073 143779 DEBUG oslo_concurrency.lockutils [req-89d5c959-33c4-4813-95ad-ab68bbb23119 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:50.074 143779 DEBUG oslo_concurrency.lockutils [req-89d5c959-33c4-4813-95ad-ab68bbb23119 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:52.199 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: e96f95a11c704306a2d7b7dd4a397ba9 reply to reply_02dbf4ca539d4c419255761ef99731cf __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320
2026-04-02 01:55:52.200 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:52.200 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 0ba32c4347604e8a9f8a84ef371a78be poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:52.200 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:52.201 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:52.203 143780 DEBUG nova.scheduler.manager [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['2de62fc7-52bf-481d-be83-6c0ff375521e'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141
2026-04-02 01:55:52.205 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:52.205 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:52.206 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:52.209 143780 DEBUG nova.scheduler.request_filter [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:55:52.209 143780 DEBUG nova.scheduler.request_filter [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254
2026-04-02 01:55:52.210 143780 DEBUG nova.scheduler.request_filter [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:55:52.210 143780 DEBUG nova.scheduler.request_filter [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:55:52.210 143780 DEBUG nova.scheduler.request_filter [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:55:52.215 143780 DEBUG oslo_concurrency.lockutils [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:52.216 143780 DEBUG oslo_concurrency.lockutils [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:52.270 143780 DEBUG oslo_messaging._drivers.amqpdriver [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: b6a1f1a4fc6c4ae9b2cfa47912947a9f NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656
2026-04-02 01:55:52.272 143780 DEBUG oslo_concurrency.lockutils [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:52.272 143780 DEBUG oslo_concurrency.lockutils [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:52.281 143780 DEBUG oslo_concurrency.lockutils [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:52.281 143780 DEBUG nova.scheduler.host_manager [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", "tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", "mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=24,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace":
"nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", "memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": 
["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0',io_workload='0',num_instances='0',num_os_type_None='0',num_proj_fba0763e6b8e4c36b71c8af0815b5fc8='0',num_task_None='0',num_vm_active='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:55:38Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-02 01:55:52.282 143780 DEBUG nova.scheduler.host_manager [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-02 01:55:52.282 143780 DEBUG nova.scheduler.host_manager [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 460, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 2, 1, 55, 51, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 55, 51, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-02 01:55:52.282 143780 DEBUG nova.scheduler.host_manager 
[req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-02 01:55:52.282 143780 DEBUG oslo_concurrency.lockutils [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:52.283 143780 INFO nova.scheduler.host_manager [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1 2026-04-02 01:55:52.283 143780 DEBUG nova.scheduler.manager [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-02 01:55:52.283 143780 DEBUG nova.scheduler.manager [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0, weight: 
0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-02 01:55:52.283 143780 DEBUG nova.scheduler.utils [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance 2de62fc7-52bf-481d-be83-6c0ff375521e claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-02 01:55:52.375 143780 DEBUG nova.scheduler.manager [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: 2de62fc7-52bf-481d-be83-6c0ff375521e] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-02 01:55:52.375 143780 DEBUG oslo_concurrency.lockutils [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:55:52.376 143780 DEBUG oslo_concurrency.lockutils [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:52.377 143780 DEBUG oslo_messaging._drivers.amqpdriver [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 5ccaec52f7154f3ca1a37857ddc2f79e NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:55:52.378 143780 DEBUG oslo_messaging._drivers.amqpdriver [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: e96f95a11c704306a2d7b7dd4a397ba9 reply queue: reply_02dbf4ca539d4c419255761ef99731cf time elapsed: 0.17873767600030988s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-02 01:55:53.105 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:53.106 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:53.106 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:53.107 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:53.107 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:53.107 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:53.107 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:53.107 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:53.108 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:53.207 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:53.207 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:53.207 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:53.396 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 664119c38918493eb91854c114e7d929 reply to reply_c93f245d97184cab8c1af1830c6ac100 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-02 01:55:53.396 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:53.396 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1350972665df4a63aec6bcdacc78f482 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:55:53.396 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:53.396 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:53.398 143779 DEBUG nova.scheduler.manager [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['0eb5e190-8389-4bed-93b1-2f0cab73912d'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-02 01:55:53.400 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:53.400 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:53.400 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:53.403 143779 DEBUG nova.scheduler.request_filter [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:55:53.403 143779 DEBUG nova.scheduler.request_filter [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 
2026-04-02 01:55:53.403 143779 DEBUG nova.scheduler.request_filter [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:55:53.403 143779 DEBUG nova.scheduler.request_filter [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:55:53.404 143779 DEBUG nova.scheduler.request_filter [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:55:53.408 143779 DEBUG oslo_concurrency.lockutils [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:55:53.409 143779 DEBUG oslo_concurrency.lockutils [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:53.464 143779 DEBUG oslo_messaging._drivers.amqpdriver [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: f5bccd1e2e114e04935223b8112e133b NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:55:53.466 143779 DEBUG oslo_concurrency.lockutils [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:55:53.467 143779 DEBUG oslo_concurrency.lockutils [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:53.475 143779 DEBUG oslo_concurrency.lockutils [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:55:53.476 143779 DEBUG nova.scheduler.host_manager 
[req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", "tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", "mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=24,free_disk_gb=76,free_ram_mb=30363,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=1,mapped=1,memory_mb=31899,memory_mb_used=1536,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], 
"memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", "memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=1,service_id=None,stats={failed_builds='0',io_workload='1',num_instances='1',num_os_type_None='1',num_proj_98217cb9e91140b6b8528e7ee5b554c7='1',num_proj_fba0763e6b8e4c36b71c8af0815b5fc8='0',num_task_None='1',num_vm_active='0',num_vm_building='1'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:55:53Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=1) _locked_update 
/usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-02 01:55:53.477 143779 DEBUG nova.scheduler.host_manager [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-02 01:55:53.477 143779 DEBUG nova.scheduler.host_manager [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 460, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 2, 1, 55, 51, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 55, 51, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-02 01:55:53.477 143779 DEBUG nova.scheduler.host_manager [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-02 01:55:53.477 143779 DEBUG oslo_concurrency.lockutils [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 
'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:53.478 143779 INFO nova.scheduler.host_manager [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1 2026-04-02 01:55:53.478 143779 DEBUG nova.scheduler.manager [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 24576MB io_ops: 1 instances: 1]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-02 01:55:53.478 143779 DEBUG nova.scheduler.manager [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 24576MB io_ops: 1 instances: 1, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-02 01:55:53.478 143779 DEBUG nova.scheduler.utils [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance 0eb5e190-8389-4bed-93b1-2f0cab73912d claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-02 01:55:53.549 143779 DEBUG nova.scheduler.manager 
[req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: 0eb5e190-8389-4bed-93b1-2f0cab73912d] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 24576MB io_ops: 1 instances: 1 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-02 01:55:53.551 143779 DEBUG oslo_concurrency.lockutils [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:55:53.554 143779 DEBUG oslo_concurrency.lockutils [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.003s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:55:53.556 143779 DEBUG oslo_messaging._drivers.amqpdriver [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 2ffe812487ab44569f80e4693c64fb6f NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:55:53.558 143779 DEBUG oslo_messaging._drivers.amqpdriver 
[req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: 664119c38918493eb91854c114e7d929 reply queue: reply_c93f245d97184cab8c1af1830c6ac100 time elapsed: 0.16155701099978614s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-02 01:55:54.401 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:54.402 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:54.402 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:55.209 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:55.210 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:55.210 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:55.518 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 97e990a723f74b399a9066015db4ef2d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:55:55.518 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 97e990a723f74b399a9066015db4ef2d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:55:55.518 143781 DEBUG 
oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 97e990a723f74b399a9066015db4ef2d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:55:55.518 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 97e990a723f74b399a9066015db4ef2d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:55:55.518 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:55.518 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:55.519 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 97e990a723f74b399a9066015db4ef2d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:55.519 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 97e990a723f74b399a9066015db4ef2d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:55.519 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:55.519 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:55.519 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:55.519 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 97e990a723f74b399a9066015db4ef2d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:55.519 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:55.519 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:55.519 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 97e990a723f74b399a9066015db4ef2d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:55.519 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:55.519 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:55.519 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:55.519 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:55.520 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:55.521 143780 DEBUG oslo_concurrency.lockutils [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:55.521 143787 DEBUG oslo_concurrency.lockutils [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:55.521 143780 DEBUG oslo_concurrency.lockutils [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:55.521 143787 DEBUG oslo_concurrency.lockutils [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:55.522 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:55.522 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:55.522 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:55.522 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:55.522 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:55.522 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:55.522 143779 DEBUG oslo_concurrency.lockutils [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:55.522 143781 DEBUG oslo_concurrency.lockutils [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:55.522 143779 DEBUG oslo_concurrency.lockutils [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:55.522 143781 DEBUG oslo_concurrency.lockutils [req-75523f8d-98ce-4573-818f-4a3756bd852b ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:55.523 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:55.523 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:55.523 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:55.523 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:55.523 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:55.523 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:56.524 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:56.524 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:56.524 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:56.524 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:56.524 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:56.524 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:56.524 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:56.524 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:56.525 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:56.525 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:56.525 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:56.525 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:56.716 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 82685e7dfaa8462dbabe87bd983c7670 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:55:56.716 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 82685e7dfaa8462dbabe87bd983c7670 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:55:56.716 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 82685e7dfaa8462dbabe87bd983c7670 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:55:56.716 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 82685e7dfaa8462dbabe87bd983c7670 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:55:56.716 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:56.716 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:56.716 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:56.716 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 82685e7dfaa8462dbabe87bd983c7670 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:56.716 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 82685e7dfaa8462dbabe87bd983c7670 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:56.716 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 82685e7dfaa8462dbabe87bd983c7670 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:56.716 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:56.716 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:56.716 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:56.716 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 82685e7dfaa8462dbabe87bd983c7670 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:56.717 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:56.717 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:56.717 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:56.717 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:56.717 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:56.717 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:56.719 143779 DEBUG oslo_concurrency.lockutils [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:56.719 143779 DEBUG oslo_concurrency.lockutils [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:56.719 143781 DEBUG oslo_concurrency.lockutils [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:56.719 143780 DEBUG oslo_concurrency.lockutils [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:56.719 143787 DEBUG oslo_concurrency.lockutils [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:56.719 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:56.719 143781 DEBUG oslo_concurrency.lockutils [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:56.719 143787 DEBUG oslo_concurrency.lockutils [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:56.719 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:56.719 143780 DEBUG oslo_concurrency.lockutils [req-67274776-3241-4345-9572-03d8c86a8226 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:56.720 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:56.720 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:56.720 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:56.720 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:56.720 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:56.720 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:56.720 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:56.720 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:56.724 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:56.724 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:56.779 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 2a11c92a914444ee98975d0bb9b59685 reply to reply_02dbf4ca539d4c419255761ef99731cf __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320
2026-04-02 01:55:56.779 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:56.779 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f803f4558ab24f0cac14a450abca4a53 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:55:56.780 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:56.780 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:56.782 143787 DEBUG nova.scheduler.manager [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['2f34a4ea-7c57-4a88-9530-3a2604165b4e'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141
2026-04-02 01:55:56.783 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:56.783 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:56.783 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:56.786 143787 DEBUG nova.scheduler.request_filter [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:55:56.786 143787 DEBUG nova.scheduler.request_filter [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254
2026-04-02 01:55:56.786 143787 DEBUG nova.scheduler.request_filter [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:55:56.787 143787 DEBUG nova.scheduler.request_filter [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:55:56.787 143787 DEBUG nova.scheduler.request_filter [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:55:56.791 143787 DEBUG oslo_concurrency.lockutils [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:56.791 143787 DEBUG oslo_concurrency.lockutils [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:56.842 143787 DEBUG oslo_messaging._drivers.amqpdriver [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 111ad2194cbd486b90af5f74c576a954 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656
2026-04-02 01:55:56.844 143787 DEBUG oslo_concurrency.lockutils [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:56.844 143787 DEBUG oslo_concurrency.lockutils [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:56.853 143787 DEBUG oslo_concurrency.lockutils [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:56.854 143787 DEBUG nova.scheduler.host_manager [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", "tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", "mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=24,free_disk_gb=75,free_ram_mb=29339,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=2,mapped=1,memory_mb=31899,memory_mb_used=2560,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", "memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=2,service_id=None,stats={failed_builds='0',io_workload='2',num_instances='2',num_os_type_None='2',num_proj_98217cb9e91140b6b8528e7ee5b554c7='2',num_proj_fba0763e6b8e4c36b71c8af0815b5fc8='0',num_task_None='2',num_vm_active='0',num_vm_building='2'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:55:54Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=2) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167
2026-04-02 01:55:56.854 143787 DEBUG nova.scheduler.host_manager [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170
2026-04-02 01:55:56.855 143787 DEBUG nova.scheduler.host_manager [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 460, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 2, 1, 55, 51, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 55, 51, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173
2026-04-02 01:55:56.855 143787 DEBUG nova.scheduler.host_manager [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: ['2de62fc7-52bf-481d-be83-6c0ff375521e', '0eb5e190-8389-4bed-93b1-2f0cab73912d'] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176
2026-04-02 01:55:56.855 143787 DEBUG oslo_concurrency.lockutils [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:56.855 143787 INFO nova.scheduler.host_manager [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1
2026-04-02 01:55:55.855 143787 DEBUG nova.scheduler.manager [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 29339MB disk: 24576MB io_ops: 2 instances: 2]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610
2026-04-02 01:55:56.856 143787 DEBUG nova.scheduler.manager [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 29339MB disk: 24576MB io_ops: 2 instances: 2, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632
2026-04-02 01:55:56.856 143787 DEBUG nova.scheduler.utils [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance 2f34a4ea-7c57-4a88-9530-3a2604165b4e claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253
2026-04-02 01:55:56.929 143787 DEBUG nova.scheduler.manager [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: 2f34a4ea-7c57-4a88-9530-3a2604165b4e] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 29339MB disk: 24576MB io_ops: 2 instances: 2 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513
2026-04-02 01:55:56.930 143787 DEBUG oslo_concurrency.lockutils [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:56.930 143787 DEBUG oslo_concurrency.lockutils [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:56.931 143787 DEBUG oslo_messaging._drivers.amqpdriver [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 44332ca388b0419e90272420fabe9d71 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656
2026-04-02 01:55:56.934 143787 DEBUG oslo_messaging._drivers.amqpdriver [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: 2a11c92a914444ee98975d0bb9b59685 reply queue: reply_02dbf4ca539d4c419255761ef99731cf time elapsed: 0.15451099899928522s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118
2026-04-02 01:55:57.042 143781 DEBUG oslo_service.periodic_task [req-5b1cc8c5-ed46-4f30-b18b-6d98b55dd2d6 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:55:57.046 143781 DEBUG oslo_concurrency.lockutils [req-ffb4c044-c60c-4a70-acb0-46f0ad792d71 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:55:57.046 143781 DEBUG oslo_concurrency.lockutils [req-ffb4c044-c60c-4a70-acb0-46f0ad792d71 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:55:57.721 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:57.721 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:57.721 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:55:57.721 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:55:57.722 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:55:57.722 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:57.725 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:57.726 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:57.726 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:57.785 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:57.785 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:57.785 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:59.723 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:59.724 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:59.724 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:59.724 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:59.724 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:59.724 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:59.729 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:59.729 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:59.729 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:55:59.787 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:55:59.788 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:55:59.788 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:02.432 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 120793e4130e4693a9105eaa32be6942 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:56:02.432 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 120793e4130e4693a9105eaa32be6942 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:56:02.432 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:02.432 
143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 120793e4130e4693a9105eaa32be6942 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:56:02.432 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:02.433 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 120793e4130e4693a9105eaa32be6942 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:56:02.433 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:02.433 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:02.433 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:02.433 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:02.435 143779 DEBUG oslo_concurrency.lockutils [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:56:02.435 143779 DEBUG oslo_concurrency.lockutils [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock 
"host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:56:02.435 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:56:02.435 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:02.435 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:02.435 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 120793e4130e4693a9105eaa32be6942 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:56:02.436 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:02.436 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 120793e4130e4693a9105eaa32be6942 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:56:02.436 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:02.436 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:02.436 143780 DEBUG oslo_concurrency.lockutils [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by 
"nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:56:02.437 143780 DEBUG oslo_concurrency.lockutils [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:56:02.437 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:56:02.437 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:02.437 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:02.438 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 120793e4130e4693a9105eaa32be6942 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:56:02.439 143787 DEBUG oslo_concurrency.lockutils [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:56:02.439 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:02.439 143781 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 120793e4130e4693a9105eaa32be6942 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:56:02.439 143787 DEBUG oslo_concurrency.lockutils [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:56:02.439 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:02.439 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:02.439 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:56:02.440 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:02.440 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:02.441 143781 DEBUG oslo_concurrency.lockutils [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:56:02.442 143781 DEBUG 
oslo_concurrency.lockutils [req-18708f2f-dc22-4d8e-9427-e5a08d78f3d1 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:56:02.442 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:56:02.442 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:02.442 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:03.093 143787 DEBUG oslo_service.periodic_task [req-64e2c892-2550-450d-9db4-5c97aad4de69 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:56:03.098 143787 DEBUG oslo_concurrency.lockutils [req-cddb7452-8632-46fa-97b9-0adff3b3ecc0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:56:03.099 143787 DEBUG oslo_concurrency.lockutils [req-cddb7452-8632-46fa-97b9-0adff3b3ecc0 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:56:03.437 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:56:03.437 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:03.437 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:03.438 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:56:03.439 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:03.439 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:03.440 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:56:03.441 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:03.441 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:03.443 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:56:03.444 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:03.444 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:05.440 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:56:05.440 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:05.440 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:05.440 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:56:05.441 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:05.441 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:05.443 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:56:05.443 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:05.443 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:05.446 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:56:05.446 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:05.447 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:09.442 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:56:09.442 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:09.443 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:09.444 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:56:09.445 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:09.445 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:09.445 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:56:09.445 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:09.446 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:09.449 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:56:09.449 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:09.449 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:15.070 143780 DEBUG oslo_service.periodic_task [req-05f4ac88-1916-4b47-a52d-c57add3596fb - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:56:15.073 143780 DEBUG oslo_concurrency.lockutils [req-3db28606-16ca-4ca7-9500-835b6a71a18d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:56:15.074 143780 DEBUG oslo_concurrency.lockutils [req-3db28606-16ca-4ca7-9500-835b6a71a18d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:56:17.444 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:56:17.444 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:17.444 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:17.446 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:56:17.446 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:17.446 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:17.448 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:56:17.449 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:17.449 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:17.451 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:56:17.452 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:17.452 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:20.996 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 89903c0323b24375afccbc3d3db81f4d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:56:20.996 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 89903c0323b24375afccbc3d3db81f4d __call__ 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:56:20.996 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 89903c0323b24375afccbc3d3db81f4d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:56:20.997 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:20.997 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:20.997 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:20.997 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 89903c0323b24375afccbc3d3db81f4d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:56:20.997 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 89903c0323b24375afccbc3d3db81f4d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:56:20.997 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 89903c0323b24375afccbc3d3db81f4d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:56:20.997 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:20.997 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:20.997 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:56:20.997 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:20.997 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:20.997 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:56:20.998 143780 DEBUG oslo_concurrency.lockutils [req-6f5caf64-6396-4a43-9ff0-f4e77050ee03 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:56:20.998 143779 DEBUG oslo_concurrency.lockutils [req-6f5caf64-6396-4a43-9ff0-f4e77050ee03 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:56:20.998 143781 DEBUG oslo_concurrency.lockutils [req-6f5caf64-6396-4a43-9ff0-f4e77050ee03 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:56:20.998 143780 DEBUG nova.scheduler.host_manager [req-6f5caf64-6396-4a43-9ff0-f4e77050ee03 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:56:20.998 143779 DEBUG nova.scheduler.host_manager [req-6f5caf64-6396-4a43-9ff0-f4e77050ee03 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:56:20.998 143781 DEBUG nova.scheduler.host_manager [req-6f5caf64-6396-4a43-9ff0-f4e77050ee03 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:56:20.998 143780 DEBUG oslo_concurrency.lockutils [req-6f5caf64-6396-4a43-9ff0-f4e77050ee03 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:56:20.998 143779 DEBUG oslo_concurrency.lockutils [req-6f5caf64-6396-4a43-9ff0-f4e77050ee03 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:56:20.998 143781 DEBUG oslo_concurrency.lockutils [req-6f5caf64-6396-4a43-9ff0-f4e77050ee03 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:56:20.999 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:56:20.999 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:56:20.999 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:56:20.999 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:20.999 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:20.999 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:20.999 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:20.999 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:20.999 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:21.000 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 89903c0323b24375afccbc3d3db81f4d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:56:21.000 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:21.000 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 89903c0323b24375afccbc3d3db81f4d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:56:21.000 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:21.000 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:21.001 143787 DEBUG oslo_concurrency.lockutils [req-6f5caf64-6396-4a43-9ff0-f4e77050ee03 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:56:21.001 143787 DEBUG nova.scheduler.host_manager [req-6f5caf64-6396-4a43-9ff0-f4e77050ee03 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:56:21.001 143787 DEBUG oslo_concurrency.lockutils [req-6f5caf64-6396-4a43-9ff0-f4e77050ee03 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:56:21.002 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:56:21.002 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:21.002 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:21.056 143779 DEBUG oslo_service.periodic_task [req-89d5c959-33c4-4813-95ad-ab68bbb23119 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:56:21.061 143779 DEBUG oslo_concurrency.lockutils [req-962f47a0-a388-4323-9679-3383cb79f24f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:56:21.061 143779 DEBUG oslo_concurrency.lockutils [req-962f47a0-a388-4323-9679-3383cb79f24f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:56:21.999 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:56:22.000 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:22.000 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:22.000 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:56:22.000 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:22.000 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:22.001 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:56:22.001 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:22.001 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:22.003 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:56:22.003 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:22.003 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:24.001 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:56:24.002 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:24.002 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:24.002 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:56:24.003 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:24.003 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:56:24.003 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:24.003 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:24.003 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:24.006 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:56:24.006 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:24.006 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:27.052 143781 DEBUG oslo_service.periodic_task [req-ffb4c044-c60c-4a70-acb0-46f0ad792d71 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:56:27.056 143781 DEBUG oslo_concurrency.lockutils [req-d5fe6c52-aa75-4997-ae18-13eeb03e0f06 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:56:27.056 143781 DEBUG oslo_concurrency.lockutils [req-d5fe6c52-aa75-4997-ae18-13eeb03e0f06 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:56:28.004 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:56:28.004 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:56:28.005 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:28.005 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:28.005 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:28.005 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:28.005 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:56:28.006 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:28.006 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:28.008 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:56:28.008 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:28.008 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:34.085 143787 DEBUG oslo_service.periodic_task [req-cddb7452-8632-46fa-97b9-0adff3b3ecc0 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:56:34.093 143787 DEBUG oslo_concurrency.lockutils [req-9838b897-1ac6-471a-ad4f-8170e0659193 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:56:34.093 143787 DEBUG oslo_concurrency.lockutils [req-9838b897-1ac6-471a-ad4f-8170e0659193 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:56:36.007 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:56:36.007 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:36.007 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:36.009 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:56:36.010 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:36.009 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:56:36.010 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:36.010 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:36.010 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:36.011 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:56:36.011 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:36.011 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:45.087 143780 DEBUG oslo_service.periodic_task [req-3db28606-16ca-4ca7-9500-835b6a71a18d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:56:45.091 143780 DEBUG oslo_concurrency.lockutils [req-ec0f715c-ce41-4977-8a20-6646d92f1dab - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:56:45.091 143780 DEBUG oslo_concurrency.lockutils [req-ec0f715c-ce41-4977-8a20-6646d92f1dab - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:56:52.009 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:56:52.010 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:52.010 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:52.011 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:56:52.011 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:56:52.012 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:52.012 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:52.012 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:52.012 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:52.012 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:56:52.013 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:56:52.013 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:56:52.056 143779 DEBUG oslo_service.periodic_task [req-962f47a0-a388-4323-9679-3383cb79f24f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:56:52.060 143779 DEBUG oslo_concurrency.lockutils [req-9b53b76c-bf74-4d3f-ba96-0061eb3ad5f4 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:56:52.061 143779 DEBUG oslo_concurrency.lockutils [req-9b53b76c-bf74-4d3f-ba96-0061eb3ad5f4 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:56:57.069 143781 DEBUG oslo_service.periodic_task [req-d5fe6c52-aa75-4997-ae18-13eeb03e0f06 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:56:57.073 143781 DEBUG oslo_concurrency.lockutils [req-fbcbf975-8e3a-49e3-b8a8-15a036dc6da2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:56:57.074 143781 DEBUG oslo_concurrency.lockutils [req-fbcbf975-8e3a-49e3-b8a8-15a036dc6da2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:57:04.868 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 9c22d40a958c44d8a6acd79918f736df __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:57:04.868 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 9c22d40a958c44d8a6acd79918f736df __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:57:04.868 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 9c22d40a958c44d8a6acd79918f736df __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:57:04.868 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 9c22d40a958c44d8a6acd79918f736df __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:57:04.868 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:57:04.868 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9c22d40a958c44d8a6acd79918f736df poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:57:04.868 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:57:04.868 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:57:04.868 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:57:04.868 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9c22d40a958c44d8a6acd79918f736df poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:57:04.868 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9c22d40a958c44d8a6acd79918f736df poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:57:04.868 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 9c22d40a958c44d8a6acd79918f736df poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:57:04.869 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:57:04.869 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:57:04.869 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:57:04.869 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:57:04.869 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:57:04.869 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:57:04.869 143781 DEBUG oslo_concurrency.lockutils [req-6be6684a-bc8f-41b6-a554-38c36968ec33 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:57:04.869 143780 DEBUG oslo_concurrency.lockutils [req-6be6684a-bc8f-41b6-a554-38c36968ec33 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:57:04.870 143779 DEBUG oslo_concurrency.lockutils [req-6be6684a-bc8f-41b6-a554-38c36968ec33 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:57:04.870 143781 DEBUG oslo_concurrency.lockutils [req-6be6684a-bc8f-41b6-a554-38c36968ec33 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:57:04.870 143780 DEBUG oslo_concurrency.lockutils [req-6be6684a-bc8f-41b6-a554-38c36968ec33 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:57:04.870 143779 DEBUG oslo_concurrency.lockutils [req-6be6684a-bc8f-41b6-a554-38c36968ec33 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:57:04.868 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:57:04.870 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:57:04.871 143787 DEBUG oslo_concurrency.lockutils [req-6be6684a-bc8f-41b6-a554-38c36968ec33 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:57:04.871 143787 DEBUG oslo_concurrency.lockutils [req-6be6684a-bc8f-41b6-a554-38c36968ec33 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:57:04.871 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:57:04.871 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:57:04.871 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:57:04.871 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:57:04.871 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:57:04.871 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:57:04.871 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:57:04.871 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:57:04.872 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:57:04.872 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:57:04.873 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:57:04.873 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:57:05.055 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: a91e8a589dad4ae0888bf2114e2bd389 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:57:05.055 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: a91e8a589dad4ae0888bf2114e2bd389 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:57:05.055 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: a91e8a589dad4ae0888bf2114e2bd389 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:57:05.055 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: a91e8a589dad4ae0888bf2114e2bd389 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:57:05.055 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:57:05.055 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:57:05.055 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:57:05.055 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: a91e8a589dad4ae0888bf2114e2bd389 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:57:05.055 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:57:05.056 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: a91e8a589dad4ae0888bf2114e2bd389 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:57:05.056 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: a91e8a589dad4ae0888bf2114e2bd389 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:57:05.056 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: a91e8a589dad4ae0888bf2114e2bd389 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:57:05.056 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:57:05.056 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:57:05.056 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:57:05.056 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:57:05.056 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:57:05.056 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:57:05.056 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:57:05.056 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:57:05.056 143779 DEBUG oslo_concurrency.lockutils [req-526343d9-b0a6-48a9-be6b-8a156d6ba570 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:57:05.056 143780 DEBUG oslo_concurrency.lockutils [req-526343d9-b0a6-48a9-be6b-8a156d6ba570 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:57:05.057 143779 DEBUG oslo_concurrency.lockutils [req-526343d9-b0a6-48a9-be6b-8a156d6ba570 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:57:05.057 143780 DEBUG oslo_concurrency.lockutils [req-526343d9-b0a6-48a9-be6b-8a156d6ba570 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:57:05.057 143781 DEBUG oslo_concurrency.lockutils [req-526343d9-b0a6-48a9-be6b-8a156d6ba570 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:57:05.057 143787 DEBUG oslo_concurrency.lockutils [req-526343d9-b0a6-48a9-be6b-8a156d6ba570 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:57:05.057 143781 DEBUG oslo_concurrency.lockutils [req-526343d9-b0a6-48a9-be6b-8a156d6ba570 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:57:05.057 143787 DEBUG oslo_concurrency.lockutils [req-526343d9-b0a6-48a9-be6b-8a156d6ba570 ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:57:05.057 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:57:05.057 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:57:05.058 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:57:05.058 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:57:05.058 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:05.058 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:05.059 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:05.059 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:05.059 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:05.059 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:05.059 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:05.059 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:05.082 143787 DEBUG oslo_service.periodic_task [req-9838b897-1ac6-471a-ad4f-8170e0659193 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:57:05.087 143787 DEBUG oslo_concurrency.lockutils [req-e4b0d188-ccfa-412f-b18b-d1536834dcec - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:05.087 143787 DEBUG oslo_concurrency.lockutils [req-e4b0d188-ccfa-412f-b18b-d1536834dcec - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:05.521 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 92d5f85542854248a6a14351e080a92c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:57:05.521 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 92d5f85542854248a6a14351e080a92c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:57:05.521 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 92d5f85542854248a6a14351e080a92c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:57:05.521 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 92d5f85542854248a6a14351e080a92c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:57:05.521 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:05.521 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:05.521 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:05.521 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 
2026-04-02 01:57:05.523 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 92d5f85542854248a6a14351e080a92c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:57:05.523 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 92d5f85542854248a6a14351e080a92c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:57:05.523 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 92d5f85542854248a6a14351e080a92c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:57:05.523 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 92d5f85542854248a6a14351e080a92c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:57:05.523 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:05.523 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:05.523 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:05.523 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:05.523 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:05.523 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 
01:57:05.523 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:05.523 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:05.524 143780 DEBUG oslo_concurrency.lockutils [req-e1e6e5c0-d146-400c-9bff-ae428378e87f ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:05.524 143781 DEBUG oslo_concurrency.lockutils [req-e1e6e5c0-d146-400c-9bff-ae428378e87f ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:05.524 143787 DEBUG oslo_concurrency.lockutils [req-e1e6e5c0-d146-400c-9bff-ae428378e87f ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:05.524 143779 DEBUG oslo_concurrency.lockutils [req-e1e6e5c0-d146-400c-9bff-ae428378e87f ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:05.524 143780 DEBUG oslo_concurrency.lockutils [req-e1e6e5c0-d146-400c-9bff-ae428378e87f ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:05.525 143787 DEBUG oslo_concurrency.lockutils [req-e1e6e5c0-d146-400c-9bff-ae428378e87f ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:05.525 143781 DEBUG oslo_concurrency.lockutils [req-e1e6e5c0-d146-400c-9bff-ae428378e87f ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:05.525 143779 DEBUG oslo_concurrency.lockutils [req-e1e6e5c0-d146-400c-9bff-ae428378e87f ae86820f69864d4180e67c6acb6ccfe5 98217cb9e91140b6b8528e7ee5b554c7 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:05.525 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:05.525 143780 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:05.525 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:05.525 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:05.525 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:05.525 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:05.525 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:05.525 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:05.525 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:05.525 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:05.525 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:05.525 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:06.526 143780 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:06.526 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:06.526 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:06.526 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:06.527 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:06.527 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:06.527 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:06.527 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:06.527 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:06.527 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:06.528 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:06.528 143781 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:08.529 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:08.529 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:08.529 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:08.529 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:08.530 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:08.529 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:08.530 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:08.530 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:08.530 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:08.530 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:08.530 
143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:08.530 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:12.531 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:12.531 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:12.532 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:12.533 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:12.534 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:12.534 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:12.534 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:12.534 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:12.535 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:12.535 
143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:12.535 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:12.535 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:15.100 143780 DEBUG oslo_service.periodic_task [req-ec0f715c-ce41-4977-8a20-6646d92f1dab - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:57:15.103 143780 DEBUG oslo_concurrency.lockutils [req-c3360b13-ef55-4efc-85b1-f853b068217b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:15.103 143780 DEBUG oslo_concurrency.lockutils [req-c3360b13-ef55-4efc-85b1-f853b068217b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:16.104 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 262e982bb8414a57a16a54046d45fd5b reply to reply_212ff63d4a534c92a13efa0416baac2d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-02 01:57:16.104 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:16.104 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message 
with unique_id: b1c8b15e07ae4c30a2e7318e2f63a250 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:57:16.104 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:16.104 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:16.106 143781 DEBUG nova.scheduler.manager [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['1da5fc77-715a-4da6-bcc7-c21b635951b4'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-02 01:57:16.108 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:16.108 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:16.108 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:16.110 143781 DEBUG nova.scheduler.request_filter [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:57:16.111 143781 DEBUG nova.scheduler.request_filter [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 
6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-02 01:57:16.111 143781 DEBUG nova.scheduler.request_filter [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:57:16.111 143781 DEBUG nova.scheduler.request_filter [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:57:16.111 143781 DEBUG nova.scheduler.request_filter [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:57:16.115 143781 DEBUG oslo_concurrency.lockutils [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:16.115 143781 DEBUG oslo_concurrency.lockutils [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 
01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:16.518 143781 DEBUG oslo_messaging._drivers.amqpdriver [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: ba707904ec2648e89a087fc9812d7301 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:57:16.520 143781 DEBUG oslo_concurrency.lockutils [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:16.520 143781 DEBUG oslo_concurrency.lockutils [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:16.528 143781 DEBUG oslo_concurrency.lockutils [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 
'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:16.529 143781 DEBUG nova.scheduler.host_manager [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", "tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", "mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=23,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": 
"nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", "memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": 
["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0',io_workload='0',num_instances='0',num_os_type_None='0',num_proj_98217cb9e91140b6b8528e7ee5b554c7='0',num_task_None='0',num_vm_active='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:57:05Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-02 01:57:16.530 143781 DEBUG nova.scheduler.host_manager [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-02 01:57:16.530 143781 DEBUG nova.scheduler.host_manager [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 468, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 2, 1, 57, 11, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 57, 11, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-02 01:57:16.530 143781 DEBUG nova.scheduler.host_manager 
[req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-02 01:57:16.530 143781 DEBUG oslo_concurrency.lockutils [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:16.531 143781 INFO nova.scheduler.host_manager [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1 2026-04-02 01:57:16.531 143781 DEBUG nova.scheduler.manager [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 23552MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-02 01:57:16.531 143781 DEBUG nova.scheduler.manager [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 23552MB io_ops: 0 instances: 0, weight: 
0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-02 01:57:16.531 143781 DEBUG nova.scheduler.utils [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance 1da5fc77-715a-4da6-bcc7-c21b635951b4 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-02 01:57:16.614 143781 DEBUG nova.scheduler.manager [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: 1da5fc77-715a-4da6-bcc7-c21b635951b4] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 23552MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-02 01:57:16.614 143781 DEBUG oslo_concurrency.lockutils [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:16.615 143781 DEBUG oslo_concurrency.lockutils [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:16.616 143781 DEBUG oslo_messaging._drivers.amqpdriver [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: c62dde3bedf645dcb2cd8f8e0ce6d504 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:57:16.617 143781 DEBUG oslo_messaging._drivers.amqpdriver [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: 262e982bb8414a57a16a54046d45fd5b reply queue: reply_212ff63d4a534c92a13efa0416baac2d time elapsed: 0.5134811610005272s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-02 01:57:17.109 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:17.109 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:17.109 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:19.111 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:19.111 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:19.112 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:19.610 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d23ac5af52f94e409be8e8dda8f2eeef __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:57:19.610 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d23ac5af52f94e409be8e8dda8f2eeef __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:57:19.610 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d23ac5af52f94e409be8e8dda8f2eeef __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:57:19.611 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:19.611 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:19.611 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:19.611 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d23ac5af52f94e409be8e8dda8f2eeef __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:57:19.611 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d23ac5af52f94e409be8e8dda8f2eeef poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:57:19.611 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d23ac5af52f94e409be8e8dda8f2eeef poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:57:19.611 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d23ac5af52f94e409be8e8dda8f2eeef poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:57:19.611 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:19.611 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:19.611 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:19.611 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:19.611 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:19.611 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:19.611 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:19.611 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d23ac5af52f94e409be8e8dda8f2eeef poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:57:19.612 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:19.612 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:19.613 143779 DEBUG oslo_concurrency.lockutils [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:19.613 143787 DEBUG oslo_concurrency.lockutils [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:19.613 143780 DEBUG oslo_concurrency.lockutils [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:19.613 143779 DEBUG oslo_concurrency.lockutils [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:19.613 143787 DEBUG oslo_concurrency.lockutils [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 
6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:19.614 143780 DEBUG oslo_concurrency.lockutils [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:19.614 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:19.614 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:19.614 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:19.614 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:19.614 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:19.614 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:19.614 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:19.614 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:19.614 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:19.614 143781 DEBUG oslo_concurrency.lockutils [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:19.615 143781 DEBUG oslo_concurrency.lockutils [req-804efe5e-cfde-47e2-b3da-598b16ab8d02 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:19.615 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:19.615 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:19.615 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:20.615 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:20.615 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:20.615 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:20.615 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:20.615 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:20.615 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:20.615 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:20.616 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:20.616 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:20.617 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:20.617 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:20.617 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:22.066 143779 DEBUG oslo_service.periodic_task [req-9b53b76c-bf74-4d3f-ba96-0061eb3ad5f4 - - - - 
-] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:57:22.070 143779 DEBUG oslo_concurrency.lockutils [req-af94365c-e349-4119-8fc9-b735d607e8bb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:22.071 143779 DEBUG oslo_concurrency.lockutils [req-af94365c-e349-4119-8fc9-b735d607e8bb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:22.616 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:22.617 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:22.617 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:22.618 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:22.618 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:22.618 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:22.618 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] 
AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:22.618 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:22.618 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:22.619 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:22.619 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:22.619 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:26.620 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:26.620 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:26.620 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:26.621 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:26.621 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:26.621 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] 
AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:26.621 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:26.622 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:26.622 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:26.622 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:26.623 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:26.623 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:27.079 143781 DEBUG oslo_service.periodic_task [req-fbcbf975-8e3a-49e3-b8a8-15a036dc6da2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:57:27.083 143781 DEBUG oslo_concurrency.lockutils [req-b84cbce3-75ce-4a5c-8846-aa07607b60ce - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:27.084 143781 DEBUG oslo_concurrency.lockutils [req-b84cbce3-75ce-4a5c-8846-aa07607b60ce - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:34.622 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:34.623 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:34.623 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:34.625 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:34.625 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:34.626 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:34.626 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:34.626 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:34.626 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:34.627 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 
01:57:34.628 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:34.628 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:35.093 143787 DEBUG oslo_service.periodic_task [req-e4b0d188-ccfa-412f-b18b-d1536834dcec - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:57:35.097 143787 DEBUG oslo_concurrency.lockutils [req-83bed8d6-03db-4d15-b0b5-60d8c65dcc5e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:35.097 143787 DEBUG oslo_concurrency.lockutils [req-83bed8d6-03db-4d15-b0b5-60d8c65dcc5e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:36.118 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 7fbd3b0638ff4acab8229f18a22761f0 reply to reply_212ff63d4a534c92a13efa0416baac2d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-02 01:57:36.119 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:36.119 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 0415fcc47b8e4b53bf893934c54e228d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:57:36.119 143780 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:36.119 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:36.121 143780 DEBUG nova.scheduler.manager [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['710dbf07-34d1-4129-a604-ed170006532f'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-02 01:57:36.123 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:36.123 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:36.123 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:36.127 143780 DEBUG nova.scheduler.request_filter [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:57:36.127 143780 DEBUG nova.scheduler.request_filter [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter 
/usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-02 01:57:36.127 143780 DEBUG nova.scheduler.request_filter [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:57:36.127 143780 DEBUG nova.scheduler.request_filter [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:57:36.128 143780 DEBUG nova.scheduler.request_filter [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:57:36.132 143780 DEBUG oslo_concurrency.lockutils [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:36.133 143780 DEBUG oslo_concurrency.lockutils [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:36.534 143780 DEBUG oslo_messaging._drivers.amqpdriver [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 78f4bb3d230844088d6c19d94a8bca55 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:57:36.536 143780 DEBUG oslo_concurrency.lockutils [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:36.536 143780 DEBUG oslo_concurrency.lockutils [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:36.544 143780 DEBUG oslo_concurrency.lockutils [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:36.544 143780 DEBUG nova.scheduler.host_manager [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", "tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", "mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=25,free_disk_gb=76,free_ram_mb=30363,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=1,mapped=1,memory_mb=31899,memory_mb_used=1536,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", 
"nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", "memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=1,service_id=None,stats={failed_builds='0',io_workload='0',num_instances='1',num_os_type_None='1',num_proj_01e1450d36d44b29baf4a0a22992292d='1',num_task_None='1',num_vm_active='1'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:57:34Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=1) 
_locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-02 01:57:36.545 143780 DEBUG nova.scheduler.host_manager [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-02 01:57:36.545 143780 DEBUG nova.scheduler.host_manager [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 470, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 2, 1, 57, 31, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 57, 31, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-02 01:57:36.545 143780 DEBUG nova.scheduler.host_manager [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: ['1da5fc77-715a-4da6-bcc7-c21b635951b4'] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-02 01:57:36.545 143780 DEBUG oslo_concurrency.lockutils [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] 
Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:36.546 143780 INFO nova.scheduler.host_manager [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1 2026-04-02 01:57:36.546 143780 DEBUG nova.scheduler.manager [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 25600MB io_ops: 0 instances: 1]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-02 01:57:36.546 143780 DEBUG nova.scheduler.manager [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 25600MB io_ops: 0 instances: 1, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-02 01:57:36.546 143780 DEBUG nova.scheduler.utils [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance 710dbf07-34d1-4129-a604-ed170006532f claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-02 01:57:36.607 143780 DEBUG 
nova.scheduler.manager [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: 710dbf07-34d1-4129-a604-ed170006532f] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 25600MB io_ops: 0 instances: 1 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-02 01:57:36.608 143780 DEBUG oslo_concurrency.lockutils [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:36.608 143780 DEBUG oslo_concurrency.lockutils [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:36.609 143780 DEBUG oslo_messaging._drivers.amqpdriver [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 2260defa0fee47dc9c5ae8e4ff00135a NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:57:36.611 143780 DEBUG oslo_messaging._drivers.amqpdriver 
[req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: 7fbd3b0638ff4acab8229f18a22761f0 reply queue: reply_212ff63d4a534c92a13efa0416baac2d time elapsed: 0.4925220780005475s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-02 01:57:37.125 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:37.125 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:37.126 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:39.127 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:39.127 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:39.127 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:39.355 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d803e6f9a051487cb0a2216fe54c2bce __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:57:39.356 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d803e6f9a051487cb0a2216fe54c2bce __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:57:39.356 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d803e6f9a051487cb0a2216fe54c2bce __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:57:39.356 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:39.356 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d803e6f9a051487cb0a2216fe54c2bce poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:57:39.356 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:39.356 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d803e6f9a051487cb0a2216fe54c2bce poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:57:39.356 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:39.356 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:39.356 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:39.356 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:39.356 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d803e6f9a051487cb0a2216fe54c2bce poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:57:39.356 143781 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:39.357 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:39.357 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:39.357 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d803e6f9a051487cb0a2216fe54c2bce __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:57:39.357 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:39.357 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d803e6f9a051487cb0a2216fe54c2bce poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:57:39.358 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:39.358 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:39.359 143780 DEBUG oslo_concurrency.lockutils [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:39.358 143781 DEBUG oslo_concurrency.lockutils 
[req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:39.359 143780 DEBUG oslo_concurrency.lockutils [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:39.359 143781 DEBUG oslo_concurrency.lockutils [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:39.359 143787 DEBUG oslo_concurrency.lockutils [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:39.359 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:39.359 143787 DEBUG oslo_concurrency.lockutils [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 
6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:39.359 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:39.359 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:39.359 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:39.359 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:39.359 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:39.359 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:39.360 143779 DEBUG oslo_concurrency.lockutils [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:39.360 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:39.360 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:39.360 143779 DEBUG oslo_concurrency.lockutils [req-df0c1f9a-a585-4729-b57d-40e3f99595db a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:39.360 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:39.360 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:39.361 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:40.360 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:40.361 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:40.361 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:40.361 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:40.361 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:40.361 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:40.361 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:40.361 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:40.362 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:40.362 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:40.362 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:40.362 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:42.363 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:42.363 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:42.363 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:42.363 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:42.363 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:42.364 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:42.365 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:42.365 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:42.365 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:42.365 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:42.365 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:42.365 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:46.044 143780 DEBUG oslo_service.periodic_task [req-c3360b13-ef55-4efc-85b1-f853b068217b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:57:46.048 143780 DEBUG oslo_concurrency.lockutils [req-6897534f-b700-4b74-bb66-22ad03f95020 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:46.048 143780 DEBUG oslo_concurrency.lockutils [req-6897534f-b700-4b74-bb66-22ad03f95020 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:46.365 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:46.365 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:46.365 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:46.366 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:46.367 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:46.367 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:46.369 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:46.369 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:46.369 143779 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:46.370 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:46.370 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:46.370 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:52.078 143779 DEBUG oslo_service.periodic_task [req-af94365c-e349-4119-8fc9-b735d607e8bb - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:57:52.082 143779 DEBUG oslo_concurrency.lockutils [req-d7365ba9-c6f9-48d9-9037-13a8c1886768 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:52.082 143779 DEBUG oslo_concurrency.lockutils [req-d7365ba9-c6f9-48d9-9037-13a8c1886768 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:57:54.369 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:54.370 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 
01:57:54.369 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:54.370 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:54.370 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:54.370 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:54.371 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:54.371 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:54.372 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:54.373 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:57:54.374 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:57:54.374 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:57:57.088 143781 DEBUG oslo_service.periodic_task [req-b84cbce3-75ce-4a5c-8846-aa07607b60ce - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells 
run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:57:57.092 143781 DEBUG oslo_concurrency.lockutils [req-584f74f9-3156-4058-a659-1d0f5fa8546b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:57:57.093 143781 DEBUG oslo_concurrency.lockutils [req-584f74f9-3156-4058-a659-1d0f5fa8546b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:05.102 143787 DEBUG oslo_service.periodic_task [req-83bed8d6-03db-4d15-b0b5-60d8c65dcc5e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:58:05.106 143787 DEBUG oslo_concurrency.lockutils [req-b4c550bd-ba52-4de1-8e1d-8f66a907538d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:05.106 143787 DEBUG oslo_concurrency.lockutils [req-b4c550bd-ba52-4de1-8e1d-8f66a907538d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:06.986 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 53906e85c20e49668a01bd33245db656 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:58:06.986 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] 
received message with unique_id: 53906e85c20e49668a01bd33245db656 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:58:06.986 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 53906e85c20e49668a01bd33245db656 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:58:06.986 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 53906e85c20e49668a01bd33245db656 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:58:06.986 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:06.986 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:06.986 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:06.986 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:06.986 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 53906e85c20e49668a01bd33245db656 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:58:06.987 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 53906e85c20e49668a01bd33245db656 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:58:06.987 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 53906e85c20e49668a01bd33245db656 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 
01:58:06.987 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 53906e85c20e49668a01bd33245db656 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:58:06.987 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:06.987 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:06.987 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:06.987 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:06.987 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:06.987 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:06.987 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:06.987 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:06.987 143781 DEBUG oslo_concurrency.lockutils [req-b699c2af-70a6-4352-8547-f56a08f2c3c7 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: 
waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:06.987 143779 DEBUG oslo_concurrency.lockutils [req-b699c2af-70a6-4352-8547-f56a08f2c3c7 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:06.987 143780 DEBUG oslo_concurrency.lockutils [req-b699c2af-70a6-4352-8547-f56a08f2c3c7 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:06.988 143787 DEBUG oslo_concurrency.lockutils [req-b699c2af-70a6-4352-8547-f56a08f2c3c7 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:06.988 143781 DEBUG oslo_concurrency.lockutils [req-b699c2af-70a6-4352-8547-f56a08f2c3c7 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:06.988 143779 DEBUG oslo_concurrency.lockutils [req-b699c2af-70a6-4352-8547-f56a08f2c3c7 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 
6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:06.988 143780 DEBUG oslo_concurrency.lockutils [req-b699c2af-70a6-4352-8547-f56a08f2c3c7 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:06.988 143787 DEBUG oslo_concurrency.lockutils [req-b699c2af-70a6-4352-8547-f56a08f2c3c7 a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:06.989 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:06.989 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:06.989 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:06.989 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:06.989 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:06.989 
143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:06.989 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:06.989 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:06.989 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:06.989 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:06.990 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:06.990 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:07.032 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 5df20c2d0d964ed78921f130d8c60280 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:58:07.032 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 5df20c2d0d964ed78921f130d8c60280 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:58:07.032 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 5df20c2d0d964ed78921f130d8c60280 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:58:07.032 143779 DEBUG oslo_messaging._drivers.amqpdriver 
[-] received message with unique_id: 5df20c2d0d964ed78921f130d8c60280 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:58:07.032 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:07.032 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:07.032 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:07.032 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5df20c2d0d964ed78921f130d8c60280 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:58:07.032 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5df20c2d0d964ed78921f130d8c60280 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:58:07.032 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5df20c2d0d964ed78921f130d8c60280 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:58:07.033 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:07.033 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:07.033 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:07.033 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:07.033 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:07.033 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:07.033 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 5df20c2d0d964ed78921f130d8c60280 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:58:07.033 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:07.033 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:07.033 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:07.033 143781 DEBUG oslo_concurrency.lockutils [req-a5e36597-43cd-44e8-8faa-731b66d75aba a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:07.033 143787 DEBUG oslo_concurrency.lockutils [req-a5e36597-43cd-44e8-8faa-731b66d75aba a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s 
inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:07.033 143780 DEBUG oslo_concurrency.lockutils [req-a5e36597-43cd-44e8-8faa-731b66d75aba a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:07.033 143781 DEBUG oslo_concurrency.lockutils [req-a5e36597-43cd-44e8-8faa-731b66d75aba a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:07.034 143787 DEBUG oslo_concurrency.lockutils [req-a5e36597-43cd-44e8-8faa-731b66d75aba a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:07.034 143780 DEBUG oslo_concurrency.lockutils [req-a5e36597-43cd-44e8-8faa-731b66d75aba a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:07.034 143779 DEBUG oslo_concurrency.lockutils [req-a5e36597-43cd-44e8-8faa-731b66d75aba a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 
6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:07.035 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:07.035 143779 DEBUG oslo_concurrency.lockutils [req-a5e36597-43cd-44e8-8faa-731b66d75aba a5d56cff887e4417b2766d91f1883c20 01e1450d36d44b29baf4a0a22992292d - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:07.035 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:07.035 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:07.035 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:07.035 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:07.035 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:07.035 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:07.035 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] 
AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:07.035 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:07.035 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:07.035 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:07.035 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:08.036 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:08.036 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:08.036 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:08.036 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:08.036 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:08.036 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:08.036 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] 
AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:08.036 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:08.036 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:08.037 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:08.037 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:08.037 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:10.037 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:10.038 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:10.038 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:10.038 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:10.038 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:10.038 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] 
AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:10.038 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:10.038 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:10.039 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:10.039 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:10.039 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:10.039 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:14.041 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:14.042 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:14.041 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:14.042 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:14.042 143781 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:14.042 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:14.043 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:14.044 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:14.044 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:14.044 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:14.044 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:14.044 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:14.969 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: ebeadf2294794e5e956a212210db07a4 reply to reply_c93f245d97184cab8c1af1830c6ac100 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-02 01:58:14.969 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:14.969 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 
a3552c0255674f719e071b59f5ad86f5 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:58:14.969 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:14.969 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:14.971 143779 DEBUG nova.scheduler.manager [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['82564acd-3778-4053-9744-bd7be657323a'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-02 01:58:14.972 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:14.972 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:14.973 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:14.976 143779 DEBUG nova.scheduler.request_filter [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:58:14.977 143779 DEBUG nova.scheduler.request_filter [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 
6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-02 01:58:14.977 143779 DEBUG nova.scheduler.request_filter [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:58:14.977 143779 DEBUG nova.scheduler.request_filter [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:58:14.977 143779 DEBUG nova.scheduler.request_filter [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:58:14.981 143779 DEBUG oslo_concurrency.lockutils [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:14.982 143779 DEBUG oslo_concurrency.lockutils [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 
1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:15.110 143779 DEBUG oslo_messaging._drivers.amqpdriver [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 23c20e34e7d6421489a1a79cf4066b29 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:58:15.111 143779 DEBUG oslo_concurrency.lockutils [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:15.112 143779 DEBUG oslo_concurrency.lockutils [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:15.120 143779 DEBUG oslo_concurrency.lockutils [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 
'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:15.120 143779 DEBUG nova.scheduler.host_manager [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", "tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", "mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=25,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": 
"nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", "memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": 
["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0',io_workload='0',num_instances='0',num_os_type_None='0',num_proj_01e1450d36d44b29baf4a0a22992292d='0',num_task_None='0',num_vm_active='0',num_vm_building='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:58:07Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-02 01:58:15.121 143779 DEBUG nova.scheduler.host_manager [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-02 01:58:15.121 143779 DEBUG nova.scheduler.host_manager [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 474, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 2, 1, 58, 11, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 58, 11, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-02 01:58:15.121 143779 DEBUG nova.scheduler.host_manager 
[req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-02 01:58:15.122 143779 DEBUG oslo_concurrency.lockutils [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:15.122 143779 INFO nova.scheduler.host_manager [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1 2026-04-02 01:58:15.122 143779 DEBUG nova.scheduler.manager [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 25600MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-02 01:58:15.122 143779 DEBUG nova.scheduler.manager [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 25600MB io_ops: 0 instances: 0, weight: 
0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-02 01:58:15.122 143779 DEBUG nova.scheduler.utils [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance 82564acd-3778-4053-9744-bd7be657323a claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-02 01:58:15.199 143779 DEBUG nova.scheduler.manager [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: 82564acd-3778-4053-9744-bd7be657323a] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 25600MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-02 01:58:15.200 143779 DEBUG oslo_concurrency.lockutils [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:15.200 143779 DEBUG oslo_concurrency.lockutils [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:15.202 143779 DEBUG oslo_messaging._drivers.amqpdriver [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: f5e5084b6b9248ad90b15f342d9d2406 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:58:15.203 143779 DEBUG oslo_messaging._drivers.amqpdriver [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: ebeadf2294794e5e956a212210db07a4 reply queue: reply_c93f245d97184cab8c1af1830c6ac100 time elapsed: 0.23432686399974045s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-02 01:58:15.974 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:15.975 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:15.975 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:16.080 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 71d6986163724ac99a704980adcd737c reply to reply_c93f245d97184cab8c1af1830c6ac100 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-02 01:58:16.080 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:16.081 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 13dafa1a3a7445a3891a7a4e0b60353f poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:58:16.081 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:16.081 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:16.083 143787 DEBUG nova.scheduler.manager [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['f6888db5-9042-40b4-9cd2-96d528ccaae2'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-02 01:58:16.084 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:16.085 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:16.085 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:16.092 143787 DEBUG nova.scheduler.request_filter [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:58:16.092 143787 DEBUG nova.scheduler.request_filter [req-126a9ed7-1744-4d28-9636-83ea77e95ccb 
bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-02 01:58:16.093 143787 DEBUG nova.scheduler.request_filter [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:58:16.093 143787 DEBUG nova.scheduler.request_filter [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:58:16.093 143787 DEBUG nova.scheduler.request_filter [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:58:16.097 143787 DEBUG oslo_concurrency.lockutils [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:16.097 143787 DEBUG oslo_concurrency.lockutils 
[req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:16.178 143787 DEBUG oslo_messaging._drivers.amqpdriver [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: db03b60aa5594b008e3a94018302771e NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:58:16.180 143787 DEBUG oslo_concurrency.lockutils [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:16.180 143787 DEBUG oslo_concurrency.lockutils [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:16.190 143787 DEBUG oslo_concurrency.lockutils [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock 
"('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:16.190 143787 DEBUG nova.scheduler.host_manager [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", "tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", 
"mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=25,free_disk_gb=76,free_ram_mb=30363,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=1,mapped=1,memory_mb=31899,memory_mb_used=1536,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", 
"memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=1,service_id=None,stats={failed_builds='0',io_workload='1',num_instances='1',num_os_type_None='1',num_proj_01e1450d36d44b29baf4a0a22992292d='0',num_proj_1bd73ee4907245b7a141592834f2a269='1',num_task_None='1',num_vm_active='0',num_vm_building='1'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:58:16Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=1) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-02 01:58:16.191 143787 DEBUG nova.scheduler.host_manager [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-02 01:58:16.192 143787 DEBUG nova.scheduler.host_manager [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 474, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 2, 1, 58, 11, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 58, 11, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': 
False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-02 01:58:16.192 143787 DEBUG nova.scheduler.host_manager [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-02 01:58:16.192 143787 DEBUG oslo_concurrency.lockutils [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:16.192 143787 INFO nova.scheduler.host_manager [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1 2026-04-02 01:58:16.193 143787 DEBUG nova.scheduler.manager [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 25600MB io_ops: 1 instances: 1]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-02 01:58:16.193 143787 DEBUG nova.scheduler.manager [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed 
[WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 25600MB io_ops: 1 instances: 1, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-02 01:58:16.193 143787 DEBUG nova.scheduler.utils [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance f6888db5-9042-40b4-9cd2-96d528ccaae2 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-02 01:58:16.271 143787 DEBUG nova.scheduler.manager [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: f6888db5-9042-40b4-9cd2-96d528ccaae2] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 25600MB io_ops: 1 instances: 1 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-02 01:58:16.271 143787 DEBUG oslo_concurrency.lockutils [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:16.272 143787 DEBUG oslo_concurrency.lockutils [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 
'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:16.273 143787 DEBUG oslo_messaging._drivers.amqpdriver [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 8df455a01b3d4a14b45d99102aa9c64a NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:58:16.275 143787 DEBUG oslo_messaging._drivers.amqpdriver [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: 71d6986163724ac99a704980adcd737c reply queue: reply_c93f245d97184cab8c1af1830c6ac100 time elapsed: 0.19413041999996494s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-02 01:58:17.044 143780 DEBUG oslo_service.periodic_task [req-6897534f-b700-4b74-bb66-22ad03f95020 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:58:17.048 143780 DEBUG oslo_concurrency.lockutils [req-d08eaa2b-2de2-4573-a0ed-fc01029108bb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:17.048 143780 DEBUG oslo_concurrency.lockutils [req-d08eaa2b-2de2-4573-a0ed-fc01029108bb - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:58:17.086 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:17.087 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:17.087 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:17.976 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:17.976 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:17.976 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:18.260 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 2c49bf155e7c4c59baeea00e95bcb36b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:58:18.260 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 2c49bf155e7c4c59baeea00e95bcb36b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:58:18.260 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 2c49bf155e7c4c59baeea00e95bcb36b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:58:18.260 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 2c49bf155e7c4c59baeea00e95bcb36b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:58:18.260 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:18.260 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:18.261 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2c49bf155e7c4c59baeea00e95bcb36b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:58:18.260 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:18.261 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2c49bf155e7c4c59baeea00e95bcb36b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:58:18.260 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:18.261 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2c49bf155e7c4c59baeea00e95bcb36b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:58:18.261 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2c49bf155e7c4c59baeea00e95bcb36b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:58:18.261 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:18.261 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:18.261 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:18.261 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:18.261 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:18.261 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:18.261 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:18.261 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:18.263 143781 DEBUG oslo_concurrency.lockutils [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:58:18.263 143781 DEBUG oslo_concurrency.lockutils [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:58:18.263 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:18.263 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:18.263 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:18.263 143780 DEBUG oslo_concurrency.lockutils [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:58:18.264 143780 DEBUG oslo_concurrency.lockutils [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:58:18.264 143787 DEBUG oslo_concurrency.lockutils [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:58:18.264 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:18.265 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:18.265 143787 DEBUG oslo_concurrency.lockutils [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:58:18.265 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:18.265 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:18.265 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:18.265 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:18.271 143779 DEBUG oslo_concurrency.lockutils [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:58:18.272 143779 DEBUG oslo_concurrency.lockutils [req-053409bd-c6d9-4726-96fa-8e08da1d3732 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:58:18.272 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:18.272 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:18.272 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:19.142 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 2eeab5cbd25847c88b59011e15f14c3b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:58:19.143 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:19.143 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2eeab5cbd25847c88b59011e15f14c3b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:58:19.143 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 2eeab5cbd25847c88b59011e15f14c3b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:58:19.143 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 2eeab5cbd25847c88b59011e15f14c3b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:58:19.144 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 2eeab5cbd25847c88b59011e15f14c3b __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:58:19.144 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:19.144 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:19.144 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2eeab5cbd25847c88b59011e15f14c3b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:58:19.144 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2eeab5cbd25847c88b59011e15f14c3b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:58:19.144 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:19.144 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 2eeab5cbd25847c88b59011e15f14c3b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:58:19.144 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:19.144 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:19.145 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:19.145 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:19.145 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:19.145 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:19.146 143781 DEBUG oslo_concurrency.lockutils [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:58:19.147 143787 DEBUG oslo_concurrency.lockutils [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:58:19.147 143781 DEBUG oslo_concurrency.lockutils [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:58:19.147 143787 DEBUG oslo_concurrency.lockutils [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:58:19.147 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:19.147 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:19.147 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:19.147 143780 DEBUG oslo_concurrency.lockutils [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:58:19.147 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:19.143 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:19.148 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:19.147 143780 DEBUG oslo_concurrency.lockutils [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:58:19.148 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:19.148 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:19.148 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:19.148 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:19.148 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:19.150 143779 DEBUG oslo_concurrency.lockutils [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:58:19.151 143779 DEBUG oslo_concurrency.lockutils [req-126a9ed7-1744-4d28-9636-83ea77e95ccb bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:58:19.151 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:19.151 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:19.151 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:20.149 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:20.149 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:20.149 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:20.150 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:20.150 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:20.150 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:20.150 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:20.150 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:20.151 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:20.153 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:20.153 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:20.153 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:22.087 143779 DEBUG oslo_service.periodic_task [req-d7365ba9-c6f9-48d9-9037-13a8c1886768 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:58:22.091 143779 DEBUG oslo_concurrency.lockutils [req-3e644524-950a-44df-a51a-6f316cc91a90 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:58:22.091 143779 DEBUG oslo_concurrency.lockutils [req-3e644524-950a-44df-a51a-6f316cc91a90 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:58:22.151 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:22.151 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:22.151 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:22.151 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:22.152 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:22.152 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:22.152 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:22.153 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:22.153 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:22.154 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:22.154 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:22.154 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:23.626 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d9cd98a07366441caef82b1946e8b913 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:58:23.626 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d9cd98a07366441caef82b1946e8b913 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:58:23.626 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d9cd98a07366441caef82b1946e8b913 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:58:23.626 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: d9cd98a07366441caef82b1946e8b913 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:58:23.626 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:23.626 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:23.626 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d9cd98a07366441caef82b1946e8b913 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:58:23.626 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d9cd98a07366441caef82b1946e8b913 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:58:23.626 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:23.627 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:23.627 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:23.627 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d9cd98a07366441caef82b1946e8b913 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:58:23.627 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:23.627 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:23.627 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: d9cd98a07366441caef82b1946e8b913 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:58:23.627 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:23.627 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:23.627 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:23.627 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:23.627 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:23.627 143781 DEBUG oslo_concurrency.lockutils [req-db7ec8b7-a4a9-44e9-b94f-6d98525aee3f - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:58:23.627 143779 DEBUG oslo_concurrency.lockutils [req-db7ec8b7-a4a9-44e9-b94f-6d98525aee3f - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:58:23.628 143781 DEBUG nova.scheduler.host_manager [req-db7ec8b7-a4a9-44e9-b94f-6d98525aee3f - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:58:23.628 143779 DEBUG nova.scheduler.host_manager [req-db7ec8b7-a4a9-44e9-b94f-6d98525aee3f - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:58:23.628 143780 DEBUG oslo_concurrency.lockutils [req-db7ec8b7-a4a9-44e9-b94f-6d98525aee3f - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:58:23.628 143781 DEBUG oslo_concurrency.lockutils [req-db7ec8b7-a4a9-44e9-b94f-6d98525aee3f - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:58:23.628 143779 DEBUG oslo_concurrency.lockutils [req-db7ec8b7-a4a9-44e9-b94f-6d98525aee3f - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:58:23.628 143787 DEBUG oslo_concurrency.lockutils [req-db7ec8b7-a4a9-44e9-b94f-6d98525aee3f - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:58:23.628 143780 DEBUG nova.scheduler.host_manager [req-db7ec8b7-a4a9-44e9-b94f-6d98525aee3f - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:58:23.628 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:23.628 143787 DEBUG nova.scheduler.host_manager [req-db7ec8b7-a4a9-44e9-b94f-6d98525aee3f - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956
2026-04-02 01:58:23.628 143780 DEBUG oslo_concurrency.lockutils [req-db7ec8b7-a4a9-44e9-b94f-6d98525aee3f - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:58:23.628 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:23.628 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:23.628 143787 DEBUG oslo_concurrency.lockutils [req-db7ec8b7-a4a9-44e9-b94f-6d98525aee3f - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:58:23.628 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:23.629 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:23.629 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:23.629 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:23.629 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:23.629 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:23.629 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:23.629 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:23.630 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:24.629 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:24.630 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:24.630 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:24.630 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:24.630 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:24.630 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:24.630 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:24.630 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:24.631 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:24.631 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:24.631 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:24.631 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:26.631 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:26.632 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:26.632 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:26.632 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:26.632 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:26.632 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:26.632 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:26.633 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:26.633 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:26.633 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:26.633 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:26.633 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:27.098 143781 DEBUG oslo_service.periodic_task [req-584f74f9-3156-4058-a659-1d0f5fa8546b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks
/usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:58:27.107 143781 DEBUG oslo_concurrency.lockutils [req-e6599d2d-def8-4407-8d29-24d14e0b43ca - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:27.108 143781 DEBUG oslo_concurrency.lockutils [req-e6599d2d-def8-4407-8d29-24d14e0b43ca - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:30.633 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:30.634 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:30.634 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:30.637 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:30.638 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:30.638 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:30.638 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:30.638 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:30.638 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:30.638 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:30.638 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:30.638 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:35.113 143787 DEBUG oslo_service.periodic_task [req-b4c550bd-ba52-4de1-8e1d-8f66a907538d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:58:35.116 143787 DEBUG oslo_concurrency.lockutils [req-a672c82a-ca50-4b25-97cd-fbf61ba0bd7a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:35.117 143787 DEBUG oslo_concurrency.lockutils [req-a672c82a-ca50-4b25-97cd-fbf61ba0bd7a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:36.002 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] 
received message with unique_id: 3754b77bc8634839b97e7b64bdfe266c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:58:36.002 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 3754b77bc8634839b97e7b64bdfe266c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:58:36.002 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 3754b77bc8634839b97e7b64bdfe266c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:58:36.002 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 3754b77bc8634839b97e7b64bdfe266c __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:58:36.002 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:36.002 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:36.002 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:36.002 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3754b77bc8634839b97e7b64bdfe266c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:58:36.002 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3754b77bc8634839b97e7b64bdfe266c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:58:36.002 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3754b77bc8634839b97e7b64bdfe266c poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:58:36.002 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:36.002 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:36.002 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:36.003 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:36.003 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:36.003 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:36.002 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:36.003 143780 DEBUG oslo_concurrency.lockutils [req-3de1cdc7-7ab2-40d4-975f-c59ca075b202 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:36.003 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3754b77bc8634839b97e7b64bdfe266c poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 
01:58:36.003 143781 DEBUG oslo_concurrency.lockutils [req-3de1cdc7-7ab2-40d4-975f-c59ca075b202 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:36.003 143780 DEBUG oslo_concurrency.lockutils [req-3de1cdc7-7ab2-40d4-975f-c59ca075b202 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:36.003 143781 DEBUG oslo_concurrency.lockutils [req-3de1cdc7-7ab2-40d4-975f-c59ca075b202 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:36.003 143787 DEBUG oslo_concurrency.lockutils [req-3de1cdc7-7ab2-40d4-975f-c59ca075b202 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:36.004 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:36.004 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:36.004 143787 DEBUG oslo_concurrency.lockutils [req-3de1cdc7-7ab2-40d4-975f-c59ca075b202 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:36.004 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:36.004 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:36.004 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:36.004 143779 DEBUG oslo_concurrency.lockutils [req-3de1cdc7-7ab2-40d4-975f-c59ca075b202 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:36.005 143779 DEBUG oslo_concurrency.lockutils [req-3de1cdc7-7ab2-40d4-975f-c59ca075b202 bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:36.005 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:36.005 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:36.005 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:36.005 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:36.005 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:36.005 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:36.006 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:36.006 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:36.006 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:36.056 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1201f3f28b29467a8f722d43f45bfbc1 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:58:36.056 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1201f3f28b29467a8f722d43f45bfbc1 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 
01:58:36.056 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1201f3f28b29467a8f722d43f45bfbc1 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:58:36.056 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:36.056 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:36.056 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:36.056 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1201f3f28b29467a8f722d43f45bfbc1 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:58:36.056 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1201f3f28b29467a8f722d43f45bfbc1 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:58:36.056 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1201f3f28b29467a8f722d43f45bfbc1 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:58:36.056 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:36.056 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:36.056 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:36.056 143780 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:36.056 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:36.057 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:36.057 143780 DEBUG oslo_concurrency.lockutils [req-e9e2d28e-4ba3-4f9c-866a-aa45a093ef6e bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:36.057 143779 DEBUG oslo_concurrency.lockutils [req-e9e2d28e-4ba3-4f9c-866a-aa45a093ef6e bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:36.057 143780 DEBUG oslo_concurrency.lockutils [req-e9e2d28e-4ba3-4f9c-866a-aa45a093ef6e bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:36.057 143779 DEBUG oslo_concurrency.lockutils [req-e9e2d28e-4ba3-4f9c-866a-aa45a093ef6e bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 
6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:36.057 143781 DEBUG oslo_concurrency.lockutils [req-e9e2d28e-4ba3-4f9c-866a-aa45a093ef6e bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:36.058 143781 DEBUG oslo_concurrency.lockutils [req-e9e2d28e-4ba3-4f9c-866a-aa45a093ef6e bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:36.058 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:36.058 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:36.058 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:36.058 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1201f3f28b29467a8f722d43f45bfbc1 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:58:36.058 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:36.059 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1201f3f28b29467a8f722d43f45bfbc1 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:58:36.059 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:36.059 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:36.059 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:36.059 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:36.059 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:36.059 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:36.059 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:36.059 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:36.059 143787 DEBUG oslo_concurrency.lockutils [req-e9e2d28e-4ba3-4f9c-866a-aa45a093ef6e bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock 
"host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:36.060 143787 DEBUG oslo_concurrency.lockutils [req-e9e2d28e-4ba3-4f9c-866a-aa45a093ef6e bfac21c1272341788c88b558ae1c0c39 1bd73ee4907245b7a141592834f2a269 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:36.061 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:36.061 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:36.061 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:37.059 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:37.060 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:37.060 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:37.060 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:37.060 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:37.060 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:37.060 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:37.060 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:37.060 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:37.063 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:37.063 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:37.063 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:39.060 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:39.061 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:39.061 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:39.062 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:39.062 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:39.062 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:39.062 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:39.062 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:39.062 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:39.065 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:39.065 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:39.065 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:43.062 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:43.062 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:43.062 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:43.065 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:43.065 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:43.066 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:43.066 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:43.066 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:43.066 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:43.070 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:43.070 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:43.070 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:43.647 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 880ed837de9c4b7d8ece2c70faed4f9a reply to reply_54bc7bb5014144dab053290b62b8003d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-02 
01:58:43.648 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:43.648 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 6c80290f14cf48d492ec8362560517b9 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:58:43.648 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:43.648 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:43.650 143781 DEBUG nova.scheduler.manager [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['b540e057-7d17-4336-ad14-ac0fcc6b475f'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-02 01:58:43.652 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:43.652 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:43.652 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:43.654 143781 DEBUG nova.scheduler.request_filter [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 
seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:58:43.655 143781 DEBUG nova.scheduler.request_filter [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-02 01:58:43.655 143781 DEBUG nova.scheduler.request_filter [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:58:43.655 143781 DEBUG nova.scheduler.request_filter [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:58:43.655 143781 DEBUG nova.scheduler.request_filter [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:58:43.659 143781 DEBUG oslo_concurrency.lockutils [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:43.660 143781 DEBUG oslo_concurrency.lockutils [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:43.711 143781 DEBUG oslo_messaging._drivers.amqpdriver [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 773dcd6aa195456e804ab924974c1611 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:58:43.712 143781 DEBUG oslo_concurrency.lockutils [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:43.713 143781 DEBUG oslo_concurrency.lockutils [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:43.721 
143781 DEBUG oslo_concurrency.lockutils [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:43.721 143781 DEBUG nova.scheduler.host_manager [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", "tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", 
"mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=24,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", 
"memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0',io_workload='0',num_instances='0',num_os_type_None='0',num_proj_1bd73ee4907245b7a141592834f2a269='0',num_task_None='0',num_vm_active='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:58:36Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-02 01:58:43.722 143781 DEBUG nova.scheduler.host_manager [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-02 01:58:43.722 143781 DEBUG nova.scheduler.host_manager [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 477, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 2, 1, 58, 41, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 58, 41, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update 
/usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-02 01:58:43.723 143781 DEBUG nova.scheduler.host_manager [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-02 01:58:43.723 143781 DEBUG oslo_concurrency.lockutils [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:43.723 143781 INFO nova.scheduler.host_manager [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1 2026-04-02 01:58:43.723 143781 DEBUG nova.scheduler.manager [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-02 01:58:43.725 143781 DEBUG nova.scheduler.manager [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: 
(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-02 01:58:43.725 143781 DEBUG nova.scheduler.utils [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance b540e057-7d17-4336-ad14-ac0fcc6b475f claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-02 01:58:43.815 143781 DEBUG nova.scheduler.manager [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: b540e057-7d17-4336-ad14-ac0fcc6b475f] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-02 01:58:43.815 143781 DEBUG oslo_concurrency.lockutils [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:43.815 143781 DEBUG oslo_concurrency.lockutils [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 
'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:43.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 579ab7c1e36146a5a51dda36655f2133 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:58:43.819 143781 DEBUG oslo_messaging._drivers.amqpdriver [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: 880ed837de9c4b7d8ece2c70faed4f9a reply queue: reply_54bc7bb5014144dab053290b62b8003d time elapsed: 0.1708823819999452s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-02 01:58:44.653 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:44.654 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:44.654 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:46.655 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:46.655 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:46.656 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:46.691 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f31fac18a3ed45719573567afabd51a3 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:58:46.691 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f31fac18a3ed45719573567afabd51a3 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:58:46.691 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f31fac18a3ed45719573567afabd51a3 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:58:46.692 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:46.692 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:46.692 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:46.692 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f31fac18a3ed45719573567afabd51a3 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:58:46.692 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f31fac18a3ed45719573567afabd51a3 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:58:46.692 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the 
incoming message with unique_id: f31fac18a3ed45719573567afabd51a3 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:58:46.692 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:46.692 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:46.692 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:46.692 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:46.692 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:46.692 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:46.694 143780 DEBUG oslo_concurrency.lockutils [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:46.694 143787 DEBUG oslo_concurrency.lockutils [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 
0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:46.694 143780 DEBUG oslo_concurrency.lockutils [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:46.695 143787 DEBUG oslo_concurrency.lockutils [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:46.695 143779 DEBUG oslo_concurrency.lockutils [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:46.695 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:46.695 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:46.695 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:46.695 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:46.695 143779 DEBUG oslo_concurrency.lockutils [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:46.695 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:46.695 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:46.695 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:46.696 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:46.696 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:46.697 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: f31fac18a3ed45719573567afabd51a3 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:58:46.697 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:46.697 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f31fac18a3ed45719573567afabd51a3 poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:58:46.697 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:46.697 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:46.699 143781 DEBUG oslo_concurrency.lockutils [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:46.700 143781 DEBUG oslo_concurrency.lockutils [req-3cb90110-fd5d-41c4-af29-7404672cb84b c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:46.700 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:46.700 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:46.700 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:47.053 143780 DEBUG oslo_service.periodic_task [req-d08eaa2b-2de2-4573-a0ed-fc01029108bb - - - - -] Running periodic task 
SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:58:47.057 143780 DEBUG oslo_concurrency.lockutils [req-86c4cbf0-404b-4887-aef0-21449aa876b8 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:58:47.057 143780 DEBUG oslo_concurrency.lockutils [req-86c4cbf0-404b-4887-aef0-21449aa876b8 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:58:47.697 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:47.697 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:47.697 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:47.697 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:47.697 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:47.697 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:47.697 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:47.697 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:47.697 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:47.701 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:47.701 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:47.702 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:49.698 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:49.698 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:49.698 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:58:49.699 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:58:49.699 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:58:49.699 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:49.700 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:49.700 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:49.700 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:49.703 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:49.703 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:49.703 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:52.096 143779 DEBUG oslo_service.periodic_task [req-3e644524-950a-44df-a51a-6f316cc91a90 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:58:52.100 143779 DEBUG oslo_concurrency.lockutils [req-cb63410d-0b24-484f-b57c-0a3969a4703e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:58:52.101 143779 DEBUG oslo_concurrency.lockutils [req-cb63410d-0b24-484f-b57c-0a3969a4703e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:58:53.701 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:53.701 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:53.702 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:53.702 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:53.702 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:53.702 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:53.702 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:53.703 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:53.703 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:53.706 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:58:53.707 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:58:53.707 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:58:57.117 143781 DEBUG oslo_service.periodic_task [req-e6599d2d-def8-4407-8d29-24d14e0b43ca - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:58:57.121 143781 DEBUG oslo_concurrency.lockutils [req-36385d03-7745-4a26-bfb2-1af66c7afd76 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:58:57.121 143781 DEBUG oslo_concurrency.lockutils [req-36385d03-7745-4a26-bfb2-1af66c7afd76 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:59:01.704 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:59:01.704 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:01.705 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:59:01.708 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:59:01.708 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:01.708 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:59:01.709 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:59:01.710 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:01.710 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:59:01.712 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:59:01.713 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:01.713 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:59:04.254 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 3b071a05e8f44a2da68785f0e7388f7d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:59:04.254 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 3b071a05e8f44a2da68785f0e7388f7d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:59:04.254 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 3b071a05e8f44a2da68785f0e7388f7d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:59:04.254 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 3b071a05e8f44a2da68785f0e7388f7d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:59:04.254 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:04.254 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:04.254 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3b071a05e8f44a2da68785f0e7388f7d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:59:04.254 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3b071a05e8f44a2da68785f0e7388f7d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:59:04.254 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:04.254 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:04.254 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:04.254 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3b071a05e8f44a2da68785f0e7388f7d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:59:04.254 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:04.254 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 3b071a05e8f44a2da68785f0e7388f7d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:59:04.254 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:59:04.255 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:59:04.255 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:04.255 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:04.255 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:59:04.255 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:59:04.255 143779 DEBUG oslo_concurrency.lockutils [req-9287abc4-7641-4105-bc74-25f91d171a67 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:59:04.255 143781 DEBUG oslo_concurrency.lockutils [req-9287abc4-7641-4105-bc74-25f91d171a67 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:59:04.255 143779 DEBUG oslo_concurrency.lockutils [req-9287abc4-7641-4105-bc74-25f91d171a67 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:59:04.255 143781 DEBUG oslo_concurrency.lockutils [req-9287abc4-7641-4105-bc74-25f91d171a67 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:59:04.256 143780 DEBUG oslo_concurrency.lockutils [req-9287abc4-7641-4105-bc74-25f91d171a67 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:59:04.256 143787 DEBUG oslo_concurrency.lockutils [req-9287abc4-7641-4105-bc74-25f91d171a67 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:59:04.256 143787 DEBUG oslo_concurrency.lockutils [req-9287abc4-7641-4105-bc74-25f91d171a67 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:59:04.256 143780 DEBUG oslo_concurrency.lockutils [req-9287abc4-7641-4105-bc74-25f91d171a67 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:59:04.256 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:59:04.256 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:59:04.257 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:04.257 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:59:04.257 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:59:04.257 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:04.257 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:59:04.257 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:04.257 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:04.257 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:59:04.257 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:59:04.257 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:59:05.122 143787 DEBUG oslo_service.periodic_task [req-a672c82a-ca50-4b25-97cd-fbf61ba0bd7a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 01:59:05.126 143787 DEBUG oslo_concurrency.lockutils [req-ba4a33c8-e5aa-480f-a88c-edc00987f3d2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:59:05.126 143787 DEBUG oslo_concurrency.lockutils [req-ba4a33c8-e5aa-480f-a88c-edc00987f3d2 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:59:05.258 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:59:05.258 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:05.258 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:59:05.258 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:59:05.258 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:05.258 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:59:05.258 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:59:05.258 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:59:05.258 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:05.258 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:05.259 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:59:05.259 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:59:07.259 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:59:07.259 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:59:07.259 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:59:07.260 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:07.260 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:07.260 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:59:07.260 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:07.260 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:59:07.260 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:59:07.260 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:59:07.260 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:07.260 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:59:08.044 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: f3540b832dfc4a48a469f9d682011b1b reply to reply_212ff63d4a534c92a13efa0416baac2d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320
2026-04-02 01:59:08.045 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:08.045 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: b9fb4040d04a435297d9421b0544ebf4 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:59:08.046 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:08.046 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:59:08.048 143780 DEBUG nova.scheduler.manager [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['c7c10f3b-8191-48df-b70f-f8fa83093412'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141
2026-04-02 01:59:08.051 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:59:08.051 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:08.051 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:59:08.054 143780 DEBUG nova.scheduler.request_filter [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:59:08.055 143780 DEBUG nova.scheduler.request_filter [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254
2026-04-02 01:59:08.055 143780 DEBUG nova.scheduler.request_filter [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:59:08.055 143780 DEBUG nova.scheduler.request_filter [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:59:08.055 143780 DEBUG nova.scheduler.request_filter [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46
2026-04-02 01:59:08.060 143780 DEBUG oslo_concurrency.lockutils [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:59:08.061 143780 DEBUG oslo_concurrency.lockutils [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:59:08.113 143780 DEBUG oslo_messaging._drivers.amqpdriver [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 09d844d09c5c42f3857f27da6b35a01b NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656
2026-04-02 01:59:08.115 143780 DEBUG oslo_concurrency.lockutils [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:59:08.115 143780 DEBUG oslo_concurrency.lockutils [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:59:08.123 143780 DEBUG oslo_concurrency.lockutils [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:59:08.123 143780 DEBUG nova.scheduler.host_manager [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", "tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", "mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=24,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", "memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0',io_workload='0',num_instances='0',num_os_type_None='0',num_proj_036dcdad78e649f48fde11092df1abfe='0',num_proj_1bd73ee4907245b7a141592834f2a269='0',num_task_None='0',num_vm_active='0',num_vm_building='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:59:04Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167
2026-04-02 01:59:08.124 143780 DEBUG nova.scheduler.host_manager [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170
2026-04-02 01:59:08.124 143780 DEBUG nova.scheduler.host_manager [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 479, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 2, 1, 59, 1, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 59, 1, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173
2026-04-02 01:59:08.124 143780 DEBUG nova.scheduler.host_manager [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176
2026-04-02 01:59:08.124 143780 DEBUG oslo_concurrency.lockutils [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:59:08.125 143780 INFO nova.scheduler.host_manager [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1
2026-04-02 01:59:08.125 143780 DEBUG nova.scheduler.manager [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610
2026-04-02 01:59:08.125 143780 DEBUG nova.scheduler.manager [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632
2026-04-02 01:59:08.125 143780 DEBUG nova.scheduler.utils [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance c7c10f3b-8191-48df-b70f-f8fa83093412 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253
2026-04-02 01:59:08.194 143780 DEBUG nova.scheduler.manager [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: c7c10f3b-8191-48df-b70f-f8fa83093412] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513
2026-04-02 01:59:08.195 143780 DEBUG oslo_concurrency.lockutils [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 01:59:08.195 143780 DEBUG oslo_concurrency.lockutils [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 01:59:08.196 143780 DEBUG oslo_messaging._drivers.amqpdriver [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 2ec37297a6d14083985c84def11c3fc3 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656
2026-04-02 01:59:08.198 143780 DEBUG oslo_messaging._drivers.amqpdriver [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: f3540b832dfc4a48a469f9d682011b1b reply queue: reply_212ff63d4a534c92a13efa0416baac2d time elapsed: 0.15316967600028875s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118
2026-04-02 01:59:09.053 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 01:59:09.053 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:09.053 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 01:59:10.970 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1cc7585c8b36456496910f7413a663da __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:59:10.970 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1cc7585c8b36456496910f7413a663da __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:59:10.970 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1cc7585c8b36456496910f7413a663da __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:59:10.970 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:10.970 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1cc7585c8b36456496910f7413a663da __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 01:59:10.970 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1cc7585c8b36456496910f7413a663da poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 01:59:10.970 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:10.970 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 01:59:10.970 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener
connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:10.970 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1cc7585c8b36456496910f7413a663da poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:59:10.970 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:10.970 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:10.971 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:10.971 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1cc7585c8b36456496910f7413a663da poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:59:10.971 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1cc7585c8b36456496910f7413a663da poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:59:10.971 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:10.971 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:10.971 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:10.971 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:10.971 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:10.972 143779 DEBUG oslo_concurrency.lockutils [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:10.973 143779 DEBUG oslo_concurrency.lockutils [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:10.973 143787 DEBUG oslo_concurrency.lockutils [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:10.973 143780 DEBUG oslo_concurrency.lockutils [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:10.973 143781 DEBUG 
oslo_concurrency.lockutils [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:10.973 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:10.973 143787 DEBUG oslo_concurrency.lockutils [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:10.973 143780 DEBUG oslo_concurrency.lockutils [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:10.973 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:10.973 143781 DEBUG oslo_concurrency.lockutils [req-130ee31e-8a9d-405b-82c9-db5c0733e7ee c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:10.974 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:10.974 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:10.974 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:10.974 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:10.974 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:10.974 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:10.974 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:10.974 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:10.974 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:10.974 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:11.975 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout 
poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:11.975 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:11.975 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:11.975 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:11.975 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:11.975 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:11.975 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:11.975 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:11.976 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:11.976 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:11.976 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:11.976 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume 
poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:13.977 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:13.977 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:13.977 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:13.977 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:13.978 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:13.978 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:13.978 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:13.978 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:13.978 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:13.978 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:13.978 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume 
poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:13.978 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:17.062 143780 DEBUG oslo_service.periodic_task [req-86c4cbf0-404b-4887-aef0-21449aa876b8 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:59:17.066 143780 DEBUG oslo_concurrency.lockutils [req-6b33c2a1-b490-4d20-96f3-9be8e32bc876 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:17.066 143780 DEBUG oslo_concurrency.lockutils [req-6b33c2a1-b490-4d20-96f3-9be8e32bc876 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:17.979 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:17.979 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:17.979 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:17.980 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:17.979 143781 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:17.980 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:17.980 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:17.980 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:17.980 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:17.980 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:17.981 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:17.981 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:22.429 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 797b5a234d134ac2a945e909c612b16d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:59:22.429 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 797b5a234d134ac2a945e909c612b16d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:59:22.429 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 
797b5a234d134ac2a945e909c612b16d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:59:22.430 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:22.430 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:22.430 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:22.430 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 797b5a234d134ac2a945e909c612b16d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:59:22.430 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 797b5a234d134ac2a945e909c612b16d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:59:22.430 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 797b5a234d134ac2a945e909c612b16d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:59:22.430 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:22.430 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:22.430 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:22.430 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:22.430 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:22.430 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:22.431 143780 DEBUG oslo_concurrency.lockutils [req-081ff448-8332-46a8-8d10-263718bfce32 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:22.431 143779 DEBUG oslo_concurrency.lockutils [req-081ff448-8332-46a8-8d10-263718bfce32 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:22.431 143781 DEBUG oslo_concurrency.lockutils [req-081ff448-8332-46a8-8d10-263718bfce32 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:22.431 143780 DEBUG oslo_concurrency.lockutils [req-081ff448-8332-46a8-8d10-263718bfce32 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by 
"nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:22.431 143779 DEBUG oslo_concurrency.lockutils [req-081ff448-8332-46a8-8d10-263718bfce32 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:22.431 143781 DEBUG oslo_concurrency.lockutils [req-081ff448-8332-46a8-8d10-263718bfce32 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:22.432 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 797b5a234d134ac2a945e909c612b16d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:59:22.432 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:22.432 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 797b5a234d134ac2a945e909c612b16d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:59:22.432 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:22.433 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 
01:59:22.433 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:22.433 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:22.433 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:22.433 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:22.433 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:22.433 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:22.433 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:22.433 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:22.433 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:22.433 143787 DEBUG oslo_concurrency.lockutils [req-081ff448-8332-46a8-8d10-263718bfce32 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner 
/usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:22.433 143787 DEBUG oslo_concurrency.lockutils [req-081ff448-8332-46a8-8d10-263718bfce32 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:22.435 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:22.435 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:22.435 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:23.057 143779 DEBUG oslo_service.periodic_task [req-cb63410d-0b24-484f-b57c-0a3969a4703e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:59:23.061 143779 DEBUG oslo_concurrency.lockutils [req-84ea9710-cc36-4a45-ad9d-cad838d35dd9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:23.061 143779 DEBUG oslo_concurrency.lockutils [req-84ea9710-cc36-4a45-ad9d-cad838d35dd9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:23.433 
143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:23.434 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:23.434 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:23.434 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:23.434 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:23.434 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:23.434 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:23.435 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:23.435 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:23.436 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:23.437 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:23.437 
143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:25.436 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:25.436 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:25.436 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:25.436 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:25.437 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:25.437 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:25.437 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:25.437 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:25.437 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:25.438 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 
01:59:25.439 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:25.439 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:27.110 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: f4898093c5f643609b4f2950be3b5a8a reply to reply_54bc7bb5014144dab053290b62b8003d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-02 01:59:27.110 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:27.110 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f0d7d471e5f64eecbcb378249871a90d poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:59:27.110 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:27.110 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:27.112 143779 DEBUG nova.scheduler.manager [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['0828f95d-b514-46e9-bf34-c965440c4928'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-02 01:59:27.114 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:27.114 143779 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:27.114 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:27.117 143779 DEBUG nova.scheduler.request_filter [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:59:27.118 143779 DEBUG nova.scheduler.request_filter [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-02 01:59:27.118 143779 DEBUG nova.scheduler.request_filter [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:59:27.118 143779 DEBUG nova.scheduler.request_filter [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:59:27.552 143779 DEBUG nova.scheduler.request_filter [req-530152c0-6679-4db3-96d0-fb6693ad84ad 
c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.4 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:59:27.557 143779 DEBUG oslo_concurrency.lockutils [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:27.557 143779 DEBUG oslo_concurrency.lockutils [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:27.605 143779 DEBUG oslo_messaging._drivers.amqpdriver [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 2b424f2ca2eb4a53a008139bc061aef1 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:59:27.607 143779 DEBUG oslo_concurrency.lockutils [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: 
waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:27.607 143779 DEBUG oslo_concurrency.lockutils [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:27.615 143779 DEBUG oslo_concurrency.lockutils [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:27.615 143779 DEBUG nova.scheduler.host_manager [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", 
"tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", "mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=24,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], 
"network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", "memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0',io_workload='0',num_instances='0',num_os_type_None='0',num_proj_036dcdad78e649f48fde11092df1abfe='0',num_proj_1bd73ee4907245b7a141592834f2a269='0',num_task_None='0',num_vm_active='0',num_vm_building='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:59:22Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-02 01:59:27.616 143779 DEBUG nova.scheduler.host_manager [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-02 01:59:27.616 143779 DEBUG nova.scheduler.host_manager [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 481, 'disabled': False, 'disabled_reason': None, 'last_seen_up': 
datetime.datetime(2026, 4, 2, 1, 59, 21, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 59, 21, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-02 01:59:27.616 143779 DEBUG nova.scheduler.host_manager [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-02 01:59:27.617 143779 DEBUG oslo_concurrency.lockutils [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:27.617 143779 INFO nova.scheduler.host_manager [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1 2026-04-02 01:59:27.617 143779 DEBUG nova.scheduler.manager [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0]) _get_sorted_hosts 
/usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-02 01:59:27.617 143779 DEBUG nova.scheduler.manager [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-02 01:59:27.618 143779 DEBUG nova.scheduler.utils [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance 0828f95d-b514-46e9-bf34-c965440c4928 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-02 01:59:27.678 143779 DEBUG nova.scheduler.manager [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: 0828f95d-b514-46e9-bf34-c965440c4928] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 24576MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-02 01:59:27.678 143779 DEBUG oslo_concurrency.lockutils [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 
2026-04-02 01:59:27.679 143779 DEBUG oslo_concurrency.lockutils [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:27.680 143779 DEBUG oslo_messaging._drivers.amqpdriver [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: f7bded63dcac4c7ab4ba16ca5f445cf5 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:59:27.682 143779 DEBUG oslo_messaging._drivers.amqpdriver [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: f4898093c5f643609b4f2950be3b5a8a reply queue: reply_54bc7bb5014144dab053290b62b8003d time elapsed: 0.5722861490003197s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-02 01:59:28.037 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 687c1edbdb6540769d8e3854d81bf8b0 reply to reply_212ff63d4a534c92a13efa0416baac2d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-02 01:59:28.037 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:28.037 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8aa3f63caa30462f9c3e02f2d48639d1 
poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:59:28.038 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:28.038 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:28.039 143787 DEBUG nova.scheduler.manager [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['92c2bce0-3b0d-4f3d-b580-7cda26761e2f'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-02 01:59:28.041 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:28.042 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:28.042 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:28.042 143781 DEBUG oslo_service.periodic_task [req-36385d03-7745-4a26-bfb2-1af66c7afd76 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:59:28.045 143787 DEBUG nova.scheduler.request_filter [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper 
/usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:59:28.045 143787 DEBUG nova.scheduler.request_filter [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-02 01:59:28.045 143787 DEBUG nova.scheduler.request_filter [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:59:28.045 143787 DEBUG nova.scheduler.request_filter [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:59:28.046 143781 DEBUG oslo_concurrency.lockutils [req-86911c37-d120-4a5a-b955-85c8c3d37a1b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:28.046 143781 DEBUG oslo_concurrency.lockutils [req-86911c37-d120-4a5a-b955-85c8c3d37a1b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:28.116 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:28.116 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:28.116 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:28.167 143787 DEBUG nova.scheduler.request_filter [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.1 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:59:28.173 143787 DEBUG oslo_concurrency.lockutils [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:28.174 143787 DEBUG oslo_concurrency.lockutils [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:28.227 143787 DEBUG oslo_messaging._drivers.amqpdriver [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 
6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: c1f1debbb1ed474c91490bdb3fa0fac6 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:59:28.229 143787 DEBUG oslo_concurrency.lockutils [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:28.229 143787 DEBUG oslo_concurrency.lockutils [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:28.240 143787 DEBUG oslo_concurrency.lockutils [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:28.240 143787 DEBUG nova.scheduler.host_manager [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": 
"Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", "tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", "mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=24,free_disk_gb=76,free_ram_mb=30363,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=1,mapped=1,memory_mb=31899,memory_mb_used=1536,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 
8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", "memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=1,service_id=None,stats={failed_builds='0',io_workload='1',num_instances='1',num_os_type_None='1',num_proj_036dcdad78e649f48fde11092df1abfe='1',num_proj_1bd73ee4907245b7a141592834f2a269='0',num_task_None='1',num_vm_active='0',num_vm_building='1'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:59:28Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=1) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-02 01:59:28.241 143787 DEBUG nova.scheduler.host_manager [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with 
aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-02 01:59:28.241 143787 DEBUG nova.scheduler.host_manager [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 481, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 2, 1, 59, 21, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 59, 21, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-02 01:59:28.241 143787 DEBUG nova.scheduler.host_manager [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-02 01:59:28.242 143787 DEBUG oslo_concurrency.lockutils [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:28.242 143787 INFO nova.scheduler.host_manager [req-92209b5c-253f-44b2-beda-f37776c2af83 
c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1 2026-04-02 01:59:28.242 143787 DEBUG nova.scheduler.manager [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 24576MB io_ops: 1 instances: 1]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-02 01:59:28.242 143787 DEBUG nova.scheduler.manager [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB disk: 24576MB io_ops: 1 instances: 1, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-02 01:59:28.242 143787 DEBUG nova.scheduler.utils [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance 92c2bce0-3b0d-4f3d-b580-7cda26761e2f claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-02 01:59:28.308 143787 DEBUG nova.scheduler.manager [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: 92c2bce0-3b0d-4f3d-b580-7cda26761e2f] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 30363MB 
disk: 24576MB io_ops: 1 instances: 1 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-02 01:59:28.308 143787 DEBUG oslo_concurrency.lockutils [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:28.309 143787 DEBUG oslo_concurrency.lockutils [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:28.310 143787 DEBUG oslo_messaging._drivers.amqpdriver [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 91e4b23bf8e44c7fa38400d4cb8c6b0c NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:59:28.312 143787 DEBUG oslo_messaging._drivers.amqpdriver [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: 687c1edbdb6540769d8e3854d81bf8b0 reply queue: reply_212ff63d4a534c92a13efa0416baac2d time elapsed: 0.2744740170001023s _send_reply 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-02 01:59:29.043 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:29.043 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:29.044 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:29.300 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: 2a594571fa53429a921a7ad70b10647b reply to reply_c93f245d97184cab8c1af1830c6ac100 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-02 01:59:29.301 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:29.301 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: f2b603128cb448f5992f4c68e62cef5b poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:59:29.301 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:29.301 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:29.303 143781 DEBUG nova.scheduler.manager [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: ['f8e7bbbc-d63e-448f-b4de-4b5aa69d51c8'] select_destinations 
/usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-02 01:59:29.304 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:29.305 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:29.305 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:29.308 143781 DEBUG nova.scheduler.request_filter [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'map_az_to_placement_aggregate' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:59:29.309 143781 DEBUG nova.scheduler.request_filter [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-02 01:59:29.309 143781 DEBUG nova.scheduler.request_filter [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:59:29.309 143781 DEBUG nova.scheduler.request_filter [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 
6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:59:29.309 143781 DEBUG nova.scheduler.request_filter [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 01:59:29.314 143781 DEBUG oslo_concurrency.lockutils [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:29.314 143781 DEBUG oslo_concurrency.lockutils [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:29.371 143781 DEBUG oslo_messaging._drivers.amqpdriver [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 90a75a02f2fb4e76b960780bf1eb0763 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:59:29.373 143781 DEBUG oslo_concurrency.lockutils [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd 
c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:29.373 143781 DEBUG oslo_concurrency.lockutils [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:29.381 143781 DEBUG oslo_concurrency.lockutils [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:29.381 143781 DEBUG nova.scheduler.host_manager [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", 
"invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", "tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", "mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=24,free_disk_gb=75,free_ram_mb=29339,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=2,mapped=1,memory_mb=31899,memory_mb_used=2560,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", "nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", 
"used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", "memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": ["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=2,service_id=None,stats={failed_builds='0',io_workload='2',num_instances='2',num_os_type_None='2',num_proj_036dcdad78e649f48fde11092df1abfe='2',num_proj_1bd73ee4907245b7a141592834f2a269='0',num_task_None='2',num_vm_active='0',num_vm_building='2'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T01:59:29Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=2) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-02 01:59:29.383 143781 DEBUG nova.scheduler.host_manager [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-02 01:59:29.383 143781 DEBUG nova.scheduler.host_manager [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] 
Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 481, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 2, 1, 59, 21, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 1, 59, 21, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-02 01:59:29.383 143781 DEBUG nova.scheduler.host_manager [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-02 01:59:29.383 143781 DEBUG oslo_concurrency.lockutils [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:29.384 143781 INFO nova.scheduler.host_manager [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Host filter forcing available hosts to cn-jenkins-deploy-platform-juju-os-690-1 2026-04-02 01:59:29.384 143781 DEBUG nova.scheduler.manager [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 
036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered dict_values([(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 29339MB disk: 24576MB io_ops: 2 instances: 2]) _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-02 01:59:29.384 143781 DEBUG nova.scheduler.manager [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 29339MB disk: 24576MB io_ops: 2 instances: 2, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-02 01:59:29.384 143781 DEBUG nova.scheduler.utils [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance f8e7bbbc-d63e-448f-b4de-4b5aa69d51c8 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-02 01:59:29.440 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:29.440 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:29.440 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:29.452 143781 DEBUG nova.scheduler.manager [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 
6b4d6e1f90944f88b6d38eafd30f73c5] [instance: f8e7bbbc-d63e-448f-b4de-4b5aa69d51c8] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 29339MB disk: 24576MB io_ops: 2 instances: 2 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-02 01:59:29.452 143781 DEBUG oslo_concurrency.lockutils [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:29.453 143781 DEBUG oslo_concurrency.lockutils [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:29.454 143781 DEBUG oslo_messaging._drivers.amqpdriver [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 0433728dbbd540c89cc2667e5ba1e7e7 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 01:59:29.455 143781 DEBUG oslo_messaging._drivers.amqpdriver [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 
6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: 2a594571fa53429a921a7ad70b10647b reply queue: reply_c93f245d97184cab8c1af1830c6ac100 time elapsed: 0.15459204400031012s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-02 01:59:30.118 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:30.118 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:30.118 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:30.306 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:30.306 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:30.306 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:31.046 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:31.046 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.046 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:31.307 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with 
unique_id: 8a749e131e5a4cf9ac9d72990364a9ed __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:59:31.307 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8a749e131e5a4cf9ac9d72990364a9ed __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:59:31.307 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8a749e131e5a4cf9ac9d72990364a9ed __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:59:31.307 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.307 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.307 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8a749e131e5a4cf9ac9d72990364a9ed poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:59:31.307 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8a749e131e5a4cf9ac9d72990364a9ed poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:59:31.307 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.308 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.308 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.308 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:31.308 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8a749e131e5a4cf9ac9d72990364a9ed poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:59:31.308 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:31.308 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.308 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:31.309 143780 DEBUG oslo_concurrency.lockutils [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:31.310 143780 DEBUG oslo_concurrency.lockutils [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:31.310 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:31.310 143779 DEBUG oslo_concurrency.lockutils 
[req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:31.310 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.310 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:31.310 143779 DEBUG oslo_concurrency.lockutils [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:31.310 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 8a749e131e5a4cf9ac9d72990364a9ed __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:59:31.311 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:31.311 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.311 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:31.312 143787 DEBUG oslo_concurrency.lockutils 
[req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:31.311 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.313 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 8a749e131e5a4cf9ac9d72990364a9ed poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:59:31.313 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.313 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:31.313 143787 DEBUG oslo_concurrency.lockutils [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:31.315 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:31.317 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.317 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:31.316 143781 DEBUG oslo_concurrency.lockutils [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:31.317 143781 DEBUG oslo_concurrency.lockutils [req-530152c0-6679-4db3-96d0-fb6693ad84ad c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:31.318 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:31.318 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.318 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:31.447 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c445421ffe844c7ea8a1a8b9bfef4c3a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:59:31.447 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c445421ffe844c7ea8a1a8b9bfef4c3a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:59:31.447 143781 DEBUG 
oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c445421ffe844c7ea8a1a8b9bfef4c3a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:59:31.447 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.447 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.447 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c445421ffe844c7ea8a1a8b9bfef4c3a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:59:31.447 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c445421ffe844c7ea8a1a8b9bfef4c3a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:59:31.447 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.448 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c445421ffe844c7ea8a1a8b9bfef4c3a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:59:31.448 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.448 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.448 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:31.448 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:31.448 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.448 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:31.449 143780 DEBUG oslo_concurrency.lockutils [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:31.450 143780 DEBUG oslo_concurrency.lockutils [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:31.450 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:31.450 143787 DEBUG oslo_concurrency.lockutils [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:31.450 
143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.450 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:31.450 143787 DEBUG oslo_concurrency.lockutils [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:31.451 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:31.451 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.451 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:31.452 143781 DEBUG oslo_concurrency.lockutils [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:31.452 143781 DEBUG oslo_concurrency.lockutils [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" 
"released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:31.453 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:31.453 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.453 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:31.454 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: c445421ffe844c7ea8a1a8b9bfef4c3a __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:59:31.455 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.455 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: c445421ffe844c7ea8a1a8b9bfef4c3a poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:59:31.455 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.455 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:31.459 143779 DEBUG oslo_concurrency.lockutils [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by 
"nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:31.459 143779 DEBUG oslo_concurrency.lockutils [req-92209b5c-253f-44b2-beda-f37776c2af83 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:31.459 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:31.459 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:31.460 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:32.452 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:32.452 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:32.452 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:32.452 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:32.453 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:32.453 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:32.454 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:32.454 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:32.455 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:32.461 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:32.461 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:32.461 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:32.568 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: bc3b0c1954f64b53b727ba7a920abac0 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:59:32.568 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: bc3b0c1954f64b53b727ba7a920abac0 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:59:32.568 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: bc3b0c1954f64b53b727ba7a920abac0 __call__ 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:59:32.569 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:32.569 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:32.569 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: bc3b0c1954f64b53b727ba7a920abac0 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:59:32.569 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:32.569 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: bc3b0c1954f64b53b727ba7a920abac0 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:59:32.569 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: bc3b0c1954f64b53b727ba7a920abac0 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:59:32.569 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:32.569 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:32.569 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:32.569 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 
2026-04-02 01:59:32.569 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:32.569 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:32.572 143780 DEBUG oslo_concurrency.lockutils [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:32.572 143781 DEBUG oslo_concurrency.lockutils [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:32.572 143780 DEBUG oslo_concurrency.lockutils [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:32.572 143781 DEBUG oslo_concurrency.lockutils [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 
0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:32.572 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: bc3b0c1954f64b53b727ba7a920abac0 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 01:59:32.572 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:32.572 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:32.572 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:32.573 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: bc3b0c1954f64b53b727ba7a920abac0 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 01:59:32.572 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:32.573 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:32.573 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:32.573 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:32.573 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:32.573 
143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:32.575 143779 DEBUG oslo_concurrency.lockutils [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:32.571 143787 DEBUG oslo_concurrency.lockutils [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:32.576 143787 DEBUG oslo_concurrency.lockutils [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.005s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:32.576 143779 DEBUG oslo_concurrency.lockutils [req-ce1e0d5e-a840-450a-bbb2-22da4ce517dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:32.577 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:32.577 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:32.577 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:32.576 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:32.577 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:32.577 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:33.573 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:33.574 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:33.574 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:33.574 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:33.574 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:33.574 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:33.578 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:33.578 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:33.578 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:33.578 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:33.579 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:33.579 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:35.576 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:35.576 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:35.576 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:35.576 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:35.576 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:35.577 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:35.579 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:35.580 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:35.580 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:35.580 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:35.581 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:35.581 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:36.083 143787 DEBUG oslo_service.periodic_task [req-ba4a33c8-e5aa-480f-a88c-edc00987f3d2 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:59:36.088 143787 DEBUG oslo_concurrency.lockutils [req-b1e564eb-fd85-4069-b3be-f9547110b473 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:36.088 143787 DEBUG 
oslo_concurrency.lockutils [req-b1e564eb-fd85-4069-b3be-f9547110b473 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:39.578 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:39.579 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:39.579 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:39.580 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:39.581 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:39.581 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:39.581 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:39.582 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:39.582 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:39.583 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:39.583 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:39.583 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:47.072 143780 DEBUG oslo_service.periodic_task [req-6b33c2a1-b490-4d20-96f3-9be8e32bc876 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:59:47.076 143780 DEBUG oslo_concurrency.lockutils [req-fca65ffb-6697-4532-aec0-0e8d2b0480a9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:47.076 143780 DEBUG oslo_concurrency.lockutils [req-fca65ffb-6697-4532-aec0-0e8d2b0480a9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:47.580 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:47.580 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:47.580 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 
2026-04-02 01:59:47.582 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:47.582 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:47.582 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:47.583 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:47.583 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:47.583 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:47.585 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 01:59:47.585 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 01:59:47.585 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 01:59:54.057 143779 DEBUG oslo_service.periodic_task [req-84ea9710-cc36-4a45-ad9d-cad838d35dd9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:59:54.061 143779 DEBUG oslo_concurrency.lockutils 
[req-12f977ea-05b2-46ad-a633-c33eae3d5118 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:54.061 143779 DEBUG oslo_concurrency.lockutils [req-12f977ea-05b2-46ad-a633-c33eae3d5118 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 01:59:59.042 143781 DEBUG oslo_service.periodic_task [req-86911c37-d120-4a5a-b955-85c8c3d37a1b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 01:59:59.046 143781 DEBUG oslo_concurrency.lockutils [req-b8451f23-125f-4ebc-b00e-832f7071253f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 01:59:59.046 143781 DEBUG oslo_concurrency.lockutils [req-b8451f23-125f-4ebc-b00e-832f7071253f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:03.582 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:03.583 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:03.583 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:03.585 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:03.585 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:03.585 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:03.585 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:03.586 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:03.586 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:03.586 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:03.587 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:03.587 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:03.762 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1df59dfc4b3a4070992cbed1792a4a19 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 02:00:03.762 143780 DEBUG 
oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1df59dfc4b3a4070992cbed1792a4a19 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 02:00:03.762 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1df59dfc4b3a4070992cbed1792a4a19 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 02:00:03.762 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 1df59dfc4b3a4070992cbed1792a4a19 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 02:00:03.762 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:03.762 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:03.762 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:03.762 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1df59dfc4b3a4070992cbed1792a4a19 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 02:00:03.762 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1df59dfc4b3a4070992cbed1792a4a19 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 02:00:03.762 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1df59dfc4b3a4070992cbed1792a4a19 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 02:00:03.762 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:03.763 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:03.763 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:03.763 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:03.762 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 1df59dfc4b3a4070992cbed1792a4a19 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 02:00:03.763 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:03.763 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:03.763 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:03.763 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:03.763 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:03.763 143780 DEBUG oslo_concurrency.lockutils [req-18b86c9e-8bb3-417b-89b7-94d48716be3f c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock 
"host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:03.763 143787 DEBUG oslo_concurrency.lockutils [req-18b86c9e-8bb3-417b-89b7-94d48716be3f c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:03.763 143780 DEBUG oslo_concurrency.lockutils [req-18b86c9e-8bb3-417b-89b7-94d48716be3f c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:03.763 143787 DEBUG oslo_concurrency.lockutils [req-18b86c9e-8bb3-417b-89b7-94d48716be3f c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:03.763 143779 DEBUG oslo_concurrency.lockutils [req-18b86c9e-8bb3-417b-89b7-94d48716be3f c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:03.763 143781 DEBUG oslo_concurrency.lockutils [req-18b86c9e-8bb3-417b-89b7-94d48716be3f 
c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:03.764 143779 DEBUG oslo_concurrency.lockutils [req-18b86c9e-8bb3-417b-89b7-94d48716be3f c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:03.764 143781 DEBUG oslo_concurrency.lockutils [req-18b86c9e-8bb3-417b-89b7-94d48716be3f c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:03.764 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:03.764 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:03.764 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:03.765 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:03.765 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:03.765 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:03.765 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:03.765 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:03.765 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:03.765 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:03.765 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:03.765 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:04.766 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:04.766 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:04.766 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:04.767 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:04.767 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:04.767 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:04.767 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:04.767 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:04.767 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:04.767 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:04.767 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:04.767 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:06.093 143787 DEBUG oslo_service.periodic_task [req-b1e564eb-fd85-4069-b3be-f9547110b473 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 02:00:06.097 143787 DEBUG oslo_concurrency.lockutils [req-daaf5e52-a1c1-41dc-bf4a-ba745d450aec - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:06.097 143787 DEBUG oslo_concurrency.lockutils [req-daaf5e52-a1c1-41dc-bf4a-ba745d450aec - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:06.769 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:06.769 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:06.769 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:06.769 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:06.769 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:06.769 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:06.769 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:06.769 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:06.769 143780 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:06.769 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:06.769 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:06.769 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:08.518 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ba54d1ded58d49e4bbdcfcb2aa3bbefa __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 02:00:08.518 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ba54d1ded58d49e4bbdcfcb2aa3bbefa __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 02:00:08.518 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ba54d1ded58d49e4bbdcfcb2aa3bbefa __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 02:00:08.518 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: ba54d1ded58d49e4bbdcfcb2aa3bbefa __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 02:00:08.518 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:08.518 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:08.518 143780 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:08.518 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:08.518 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ba54d1ded58d49e4bbdcfcb2aa3bbefa poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 02:00:08.518 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ba54d1ded58d49e4bbdcfcb2aa3bbefa poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 02:00:08.518 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ba54d1ded58d49e4bbdcfcb2aa3bbefa poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 02:00:08.518 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: ba54d1ded58d49e4bbdcfcb2aa3bbefa poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 02:00:08.518 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:08.518 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:08.518 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:08.518 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:08.519 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:08.519 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:08.519 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:08.519 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:08.519 143780 DEBUG oslo_concurrency.lockutils [req-7c1e97e7-ecfb-45ba-bd64-35ad4652ded0 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:08.519 143779 DEBUG oslo_concurrency.lockutils [req-7c1e97e7-ecfb-45ba-bd64-35ad4652ded0 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:08.519 143781 DEBUG oslo_concurrency.lockutils [req-7c1e97e7-ecfb-45ba-bd64-35ad4652ded0 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:08.519 
143787 DEBUG oslo_concurrency.lockutils [req-7c1e97e7-ecfb-45ba-bd64-35ad4652ded0 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:08.519 143780 DEBUG oslo_concurrency.lockutils [req-7c1e97e7-ecfb-45ba-bd64-35ad4652ded0 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:08.519 143779 DEBUG oslo_concurrency.lockutils [req-7c1e97e7-ecfb-45ba-bd64-35ad4652ded0 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:08.519 143787 DEBUG oslo_concurrency.lockutils [req-7c1e97e7-ecfb-45ba-bd64-35ad4652ded0 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:08.519 143781 DEBUG oslo_concurrency.lockutils [req-7c1e97e7-ecfb-45ba-bd64-35ad4652ded0 c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by 
"nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:08.520 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:08.520 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:08.520 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:08.521 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:08.521 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:08.521 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:08.521 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:08.521 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:08.521 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:08.521 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:08.521 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:08.521 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:08.558 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 072f76d669f042f8b3ddb6831bf34d24 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 02:00:08.558 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 072f76d669f042f8b3ddb6831bf34d24 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 02:00:08.558 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 072f76d669f042f8b3ddb6831bf34d24 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 02:00:08.558 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 072f76d669f042f8b3ddb6831bf34d24 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 02:00:08.558 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:08.558 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:08.558 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:08.558 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 072f76d669f042f8b3ddb6831bf34d24 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 
02:00:08.558 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 072f76d669f042f8b3ddb6831bf34d24 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 02:00:08.558 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:08.558 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 072f76d669f042f8b3ddb6831bf34d24 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 02:00:08.558 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 072f76d669f042f8b3ddb6831bf34d24 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 02:00:08.558 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:08.558 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:08.558 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:08.558 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:08.558 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:08.558 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:08.558 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] 
AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:08.559 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:08.559 143781 DEBUG oslo_concurrency.lockutils [req-a1d9e012-784a-42c0-b8d5-df05ec6404dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:08.559 143780 DEBUG oslo_concurrency.lockutils [req-a1d9e012-784a-42c0-b8d5-df05ec6404dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:08.559 143779 DEBUG oslo_concurrency.lockutils [req-a1d9e012-784a-42c0-b8d5-df05ec6404dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:08.559 143781 DEBUG oslo_concurrency.lockutils [req-a1d9e012-784a-42c0-b8d5-df05ec6404dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 
2026-04-02 02:00:08.559 143780 DEBUG oslo_concurrency.lockutils [req-a1d9e012-784a-42c0-b8d5-df05ec6404dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:08.559 143787 DEBUG oslo_concurrency.lockutils [req-a1d9e012-784a-42c0-b8d5-df05ec6404dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:08.559 143779 DEBUG oslo_concurrency.lockutils [req-a1d9e012-784a-42c0-b8d5-df05ec6404dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:08.559 143787 DEBUG oslo_concurrency.lockutils [req-a1d9e012-784a-42c0-b8d5-df05ec6404dd c9620214d1db4086ae16eb97d3c51c72 036dcdad78e649f48fde11092df1abfe - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:08.561 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:08.561 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:08.562 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:08.562 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:08.562 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:08.562 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:08.561 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:08.561 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:08.562 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:08.562 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:08.563 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:08.564 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:09.562 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:09.563 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:09.563 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:09.563 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:09.563 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:09.563 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:09.564 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:09.564 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:09.564 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:09.565 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:09.565 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:09.565 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:11.565 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:11.565 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:11.565 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:11.565 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:11.565 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:11.565 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:11.566 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:11.567 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:11.567 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:11.567 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:11.567 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:11.567 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:15.568 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:15.568 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:15.568 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:15.569 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:15.570 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:15.570 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:15.570 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:15.570 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:15.570 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:15.571 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection 
timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:15.571 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:15.571 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:17.082 143780 DEBUG oslo_service.periodic_task [req-fca65ffb-6697-4532-aec0-0e8d2b0480a9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 02:00:17.089 143780 DEBUG oslo_concurrency.lockutils [req-9137505b-80fa-415d-a085-0178bcc81683 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:17.090 143780 DEBUG oslo_concurrency.lockutils [req-9137505b-80fa-415d-a085-0178bcc81683 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:23.574 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:23.575 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:23.575 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:23.576 143779 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:23.576 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:23.576 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:23.576 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:23.576 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:23.576 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:23.577 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:23.577 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:23.577 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:24.066 143779 DEBUG oslo_service.periodic_task [req-12f977ea-05b2-46ad-a633-c33eae3d5118 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 02:00:24.070 143779 DEBUG oslo_concurrency.lockutils [req-893745f5-9dc2-47d1-9cce-cdfdc6775be5 - - - - -] Lock 
"93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:24.071 143779 DEBUG oslo_concurrency.lockutils [req-893745f5-9dc2-47d1-9cce-cdfdc6775be5 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:27.615 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4b0fd227411b4721a71c22550c5affa8 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 02:00:27.615 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4b0fd227411b4721a71c22550c5affa8 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 02:00:27.615 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4b0fd227411b4721a71c22550c5affa8 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 02:00:27.615 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 4b0fd227411b4721a71c22550c5affa8 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 02:00:27.615 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:27.615 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:27.615 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:27.616 143779 
DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:27.616 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4b0fd227411b4721a71c22550c5affa8 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 02:00:27.616 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4b0fd227411b4721a71c22550c5affa8 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 02:00:27.616 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4b0fd227411b4721a71c22550c5affa8 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 02:00:27.616 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 4b0fd227411b4721a71c22550c5affa8 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 02:00:27.616 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:27.616 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:27.616 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:27.616 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:27.616 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:27.616 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:27.616 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:27.616 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:27.617 143781 DEBUG oslo_concurrency.lockutils [req-74dcdb42-65e4-4f83-9932-a81ab6dbb580 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:27.617 143779 DEBUG oslo_concurrency.lockutils [req-74dcdb42-65e4-4f83-9932-a81ab6dbb580 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:27.617 143780 DEBUG oslo_concurrency.lockutils [req-74dcdb42-65e4-4f83-9932-a81ab6dbb580 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:27.617 143781 DEBUG nova.scheduler.host_manager [req-74dcdb42-65e4-4f83-9932-a81ab6dbb580 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 02:00:27.617 143779 DEBUG nova.scheduler.host_manager [req-74dcdb42-65e4-4f83-9932-a81ab6dbb580 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 02:00:27.617 143787 DEBUG oslo_concurrency.lockutils [req-74dcdb42-65e4-4f83-9932-a81ab6dbb580 - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:27.617 143781 DEBUG oslo_concurrency.lockutils [req-74dcdb42-65e4-4f83-9932-a81ab6dbb580 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:27.617 143787 DEBUG nova.scheduler.host_manager [req-74dcdb42-65e4-4f83-9932-a81ab6dbb580 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 02:00:27.617 143780 DEBUG nova.scheduler.host_manager [req-74dcdb42-65e4-4f83-9932-a81ab6dbb580 - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 02:00:27.617 143779 DEBUG oslo_concurrency.lockutils [req-74dcdb42-65e4-4f83-9932-a81ab6dbb580 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:27.617 143787 DEBUG oslo_concurrency.lockutils [req-74dcdb42-65e4-4f83-9932-a81ab6dbb580 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:27.617 143780 DEBUG oslo_concurrency.lockutils [req-74dcdb42-65e4-4f83-9932-a81ab6dbb580 - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:27.617 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:27.618 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:27.618 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:27.618 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:27.618 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:27.618 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout 
poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:27.618 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:27.618 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:27.618 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:27.618 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:27.618 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:27.618 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:28.618 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:28.619 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:28.619 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:28.619 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:28.619 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:28.619 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:28.620 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:28.620 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:28.620 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:28.620 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:28.620 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:28.620 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:29.051 143781 DEBUG oslo_service.periodic_task [req-b8451f23-125f-4ebc-b00e-832f7071253f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 02:00:29.055 143781 DEBUG oslo_concurrency.lockutils [req-144b4892-41bb-4a25-a86e-2d66f2f0c17d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:29.056 143781 
DEBUG oslo_concurrency.lockutils [req-144b4892-41bb-4a25-a86e-2d66f2f0c17d - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:30.621 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:30.621 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:30.621 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:30.621 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:30.621 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:30.621 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:30.622 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:30.622 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:30.622 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:30.622 143779 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:30.622 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:30.622 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:34.624 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:34.624 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:34.624 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:34.625 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:34.625 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:34.626 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:34.626 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:34.626 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:34.626 143781 DEBUG 
oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:34.626 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:34.626 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:34.626 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:35.421 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message msg_id: fe8cbae1b9f74fe58efcf7029e076f5e reply to reply_212ff63d4a534c92a13efa0416baac2d __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:320 2026-04-02 02:00:35.421 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:35.421 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 041dc5aa6aa7498c85a6f1e4d09bc5b9 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 02:00:35.421 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:35.422 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:35.424 143780 DEBUG nova.scheduler.manager [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting to schedule for instances: 
['e9b4cb39-d3ec-47eb-aca8-b78d29cac6f9'] select_destinations /usr/lib/python3/dist-packages/nova/scheduler/manager.py:141 2026-04-02 02:00:35.424 143780 DEBUG nova.scheduler.request_filter [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] compute_status_filter request filter added forbidden trait COMPUTE_STATUS_DISABLED compute_status_filter /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:254 2026-04-02 02:00:35.424 143780 DEBUG nova.scheduler.request_filter [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'compute_status_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 02:00:35.425 143780 DEBUG nova.scheduler.request_filter [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'accelerators_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 02:00:35.425 143780 DEBUG nova.scheduler.request_filter [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Request filter 'remote_managed_ports_filter' took 0.0 seconds wrapper /usr/lib/python3/dist-packages/nova/scheduler/request_filter.py:46 2026-04-02 02:00:35.429 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:35.429 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:35.429 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:35.477 143780 DEBUG oslo_messaging._drivers.amqpdriver [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: f91e2b1240d94d299cd49b01610537f8 NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 02:00:35.479 143780 DEBUG oslo_concurrency.lockutils [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:35.479 143780 DEBUG oslo_concurrency.lockutils [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:35.488 143780 DEBUG oslo_concurrency.lockutils [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by 
"nova.scheduler.host_manager.HostState.update.._locked_update" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:35.488 143780 DEBUG nova.scheduler.host_manager [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state from compute node: ComputeNode(cpu_allocation_ratio=2.0,cpu_info='{"arch": "x86_64", "model": "Broadwell-IBRS", "vendor": "Intel", "topology": {"cells": 1, "sockets": 1, "cores": 8, "threads": 1}, "features": ["popcnt", "avx512vnni", "avx512cd", "fpu", "lm", "mmx", "pae", "hle", "tsc-deadline", "xsavec", "avx512-vpopcntdq", "abm", "movbe", "apic", "de", "cmov", "cx8", "avx", "avx512f", "nx", "bmi1", "rdtscp", "avx512dq", "invpcid", "pat", "pni", "sse4.2", "wbnoinvd", "lahf_lm", "rdrand", "xsave", "erms", "clflush", "syscall", "fma", "f16c", "gfni", "smap", "pclmuldq", "rtm", "smep", "sse4.1", "spec-ctrl", "aes", "mca", "pdpe1gb", "pse36", "cx16", "pge", "sep", "xsaveopt", "rdseed", "tsc", "x2apic", "xgetbv1", "pku", "avx512bitalg", "hypervisor", "avx512vl", "clflushopt", "ssse3", "la57", "clwb", "mtrr", "adx", "avx512bw", "vme", "arat", "fsgsbase", "avx512vbmi", "bmi2", "msr", "umip", "fxsr", "ssbd", "sse", "pcid", "vpclmulqdq", "ht", "avx2", "vaes", "3dnowprefetch", "sse2", "avx512vbmi2", "pse", "mce"]}',created_at=2026-04-02T00:38:54Z,current_workload=0,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=23,free_disk_gb=77,free_ram_mb=31387,host='cn-jenkins-deploy-platform-juju-os-690-1',host_ip=10.0.0.41,hypervisor_hostname='cn-jenkins-deploy-platform-juju-os-690-1',hypervisor_type='QEMU',hypervisor_version=6002000,id=1,local_gb=77,local_gb_used=0,mapped=1,memory_mb=31899,memory_mb_used=512,metrics='[]',numa_topology='{"nova_object.name": "NUMATopology", "nova_object.namespace": "nova", "nova_object.version": "1.2", 
"nova_object.data": {"cells": [{"nova_object.name": "NUMACell", "nova_object.namespace": "nova", "nova_object.version": "1.5", "nova_object.data": {"id": 0, "cpuset": [0, 1, 2, 3, 4, 5, 6, 7], "pcpuset": [0, 1, 2, 3, 4, 5, 6, 7], "memory": 31899, "cpu_usage": 0, "memory_usage": 0, "pinned_cpus": [], "siblings": [[6], [2], [5], [4], [1], [7], [0], [3]], "mempages": [{"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 4, "total": 8035206, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 2048, "total": 256, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}, {"nova_object.name": "NUMAPagesTopology", "nova_object.namespace": "nova", "nova_object.version": "1.1", "nova_object.data": {"size_kb": 1048576, "total": 0, "used": 0, "reserved": 0}, "nova_object.changes": ["total", "size_kb", "used", "reserved"]}], "network_metadata": {"nova_object.name": "NetworkMetadata", "nova_object.namespace": "nova", "nova_object.version": "1.0", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.changes": ["tunneled", "physnets"]}, "socket": 0}, "nova_object.changes": ["socket", "memory_usage", "network_metadata", "memory", "id", "pinned_cpus", "siblings", "cpuset", "pcpuset", "cpu_usage", "mempages"]}]}, "nova_object.changes": 
["cells"]}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=0.98,running_vms=0,service_id=None,stats={failed_builds='0',io_workload='0',num_instances='0',num_os_type_None='0',num_proj_036dcdad78e649f48fde11092df1abfe='0',num_task_None='0',num_vm_active='0'},supported_hv_specs=[HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec,HVSpec],updated_at=2026-04-02T02:00:08Z,uuid=7e276f57-e8d5-45af-a099-7c5d9d12ef48,vcpus=8,vcpus_used=0) _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:167 2026-04-02 02:00:35.491 143780 DEBUG nova.scheduler.host_manager [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with aggregates: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:170 2026-04-02 02:00:35.491 143780 DEBUG nova.scheduler.host_manager [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with service dict: {'id': 9, 'uuid': 'd7ce5e98-d5b3-435b-af07-7d8248882ded', 'host': 'cn-jenkins-deploy-platform-juju-os-690-1', 'binary': 'nova-compute', 'topic': 'compute', 'report_count': 488, 'disabled': False, 'disabled_reason': None, 'last_seen_up': datetime.datetime(2026, 4, 2, 2, 0, 31, tzinfo=datetime.timezone.utc), 'forced_down': False, 'version': 61, 'created_at': datetime.datetime(2026, 4, 2, 0, 38, 54, tzinfo=datetime.timezone.utc), 'updated_at': datetime.datetime(2026, 4, 2, 2, 0, 31, tzinfo=datetime.timezone.utc), 'deleted_at': None, 'deleted': False} _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:173 2026-04-02 02:00:35.491 143780 DEBUG nova.scheduler.host_manager 
[req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Update host state with instances: [] _locked_update /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:176 2026-04-02 02:00:35.491 143780 DEBUG oslo_concurrency.lockutils [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.update.._locked_update" :: held 0.004s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:35.491 143780 DEBUG nova.filters [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Starting with 1 host(s) get_filtered_objects /usr/lib/python3/dist-packages/nova/filters.py:70 2026-04-02 02:00:35.492 143780 DEBUG nova.filters [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filter AvailabilityZoneFilter returned 1 host(s) get_filtered_objects /usr/lib/python3/dist-packages/nova/filters.py:102 2026-04-02 02:00:35.492 143780 DEBUG nova.filters [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filter ComputeFilter returned 1 host(s) get_filtered_objects /usr/lib/python3/dist-packages/nova/filters.py:102 2026-04-02 02:00:35.492 143780 DEBUG nova.filters [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 
6b4d6e1f90944f88b6d38eafd30f73c5] Filter ComputeCapabilitiesFilter returned 1 host(s) get_filtered_objects /usr/lib/python3/dist-packages/nova/filters.py:102 2026-04-02 02:00:35.493 143780 DEBUG nova.filters [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filter ImagePropertiesFilter returned 1 host(s) get_filtered_objects /usr/lib/python3/dist-packages/nova/filters.py:102 2026-04-02 02:00:35.494 143780 DEBUG nova.filters [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filter ServerGroupAntiAffinityFilter returned 1 host(s) get_filtered_objects /usr/lib/python3/dist-packages/nova/filters.py:102 2026-04-02 02:00:35.494 143780 DEBUG nova.filters [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filter ServerGroupAffinityFilter returned 1 host(s) get_filtered_objects /usr/lib/python3/dist-packages/nova/filters.py:102 2026-04-02 02:00:35.494 143780 DEBUG nova.filters [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filter DifferentHostFilter returned 1 host(s) get_filtered_objects /usr/lib/python3/dist-packages/nova/filters.py:102 2026-04-02 02:00:35.494 143780 DEBUG nova.filters [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filter SameHostFilter returned 1 host(s) get_filtered_objects /usr/lib/python3/dist-packages/nova/filters.py:102 2026-04-02 02:00:35.494 143780 DEBUG nova.scheduler.manager [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 
97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Filtered [(cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 23552MB io_ops: 0 instances: 0] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:610 2026-04-02 02:00:35.495 143780 DEBUG nova.scheduler.manager [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Weighed [WeighedHost [host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 23552MB io_ops: 0 instances: 0, weight: 0.0]] _get_sorted_hosts /usr/lib/python3/dist-packages/nova/scheduler/manager.py:632 2026-04-02 02:00:35.495 143780 DEBUG nova.scheduler.utils [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Attempting to claim resources in the placement API for instance e9b4cb39-d3ec-47eb-aca8-b78d29cac6f9 claim_resources /usr/lib/python3/dist-packages/nova/scheduler/utils.py:1253 2026-04-02 02:00:35.583 143780 DEBUG nova.scheduler.manager [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] [instance: e9b4cb39-d3ec-47eb-aca8-b78d29cac6f9] Selected host: (cn-jenkins-deploy-platform-juju-os-690-1, cn-jenkins-deploy-platform-juju-os-690-1) ram: 31387MB disk: 23552MB io_ops: 0 instances: 0 _consume_selected_host /usr/lib/python3/dist-packages/nova/scheduler/manager.py:513 2026-04-02 02:00:35.584 143780 DEBUG oslo_concurrency.lockutils [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 
6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" acquired by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:35.584 143780 DEBUG oslo_concurrency.lockutils [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "('cn-jenkins-deploy-platform-juju-os-690-1', 'cn-jenkins-deploy-platform-juju-os-690-1')" "released" by "nova.scheduler.host_manager.HostState.consume_from_request.._locked" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:35.585 143780 DEBUG oslo_messaging._drivers.amqpdriver [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] CAST unique_id: 5929547cf3714066bd0d0672a9c38c9f NOTIFY exchange 'nova' topic 'notifications.info' _send /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:656 2026-04-02 02:00:35.587 143780 DEBUG oslo_messaging._drivers.amqpdriver [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] sending reply msg_id: fe8cbae1b9f74fe58efcf7029e076f5e reply queue: reply_212ff63d4a534c92a13efa0416baac2d time elapsed: 0.16622774999996182s _send_reply /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:118 2026-04-02 02:00:36.103 143787 DEBUG oslo_service.periodic_task [req-daaf5e52-a1c1-41dc-bf4a-ba745d450aec - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 02:00:36.109 
143787 DEBUG oslo_concurrency.lockutils [req-d0d27435-52f5-4c34-9656-9453707d6c5b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:36.110 143787 DEBUG oslo_concurrency.lockutils [req-d0d27435-52f5-4c34-9656-9453707d6c5b - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:36.430 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:36.430 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:36.431 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:38.433 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:38.433 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:38.433 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:38.878 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 80d35cf23fb64154b9d2c1db436e39f6 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 02:00:38.878 143781 
DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 80d35cf23fb64154b9d2c1db436e39f6 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 02:00:38.878 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 80d35cf23fb64154b9d2c1db436e39f6 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 02:00:38.879 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:38.879 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:38.879 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 80d35cf23fb64154b9d2c1db436e39f6 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 02:00:38.879 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:38.879 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 80d35cf23fb64154b9d2c1db436e39f6 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 02:00:38.879 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 80d35cf23fb64154b9d2c1db436e39f6 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 02:00:38.879 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:38.879 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:38.879 
143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:38.879 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:38.879 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:38.879 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:38.881 143779 DEBUG oslo_concurrency.lockutils [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:38.881 143781 DEBUG oslo_concurrency.lockutils [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:38.881 143781 DEBUG oslo_concurrency.lockutils [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 
02:00:38.881 143779 DEBUG oslo_concurrency.lockutils [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:38.882 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:38.882 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:38.882 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:38.882 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:38.882 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:38.882 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:38.882 143780 DEBUG oslo_concurrency.lockutils [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:38.882 143780 DEBUG oslo_concurrency.lockutils 
[req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:38.883 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:38.883 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:38.883 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:38.884 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 80d35cf23fb64154b9d2c1db436e39f6 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 02:00:38.884 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:38.885 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 80d35cf23fb64154b9d2c1db436e39f6 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 02:00:38.885 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:38.885 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:38.887 143787 DEBUG oslo_concurrency.lockutils [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 
97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.update_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:00:38.887 143787 DEBUG oslo_concurrency.lockutils [req-059639b5-1954-44f9-8e8a-a4dd6c297a6f 97736e1228004f4299f837731b9fa84d a998440975ac4fd8a9c6c20063967804 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.update_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:00:38.888 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:38.888 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:38.888 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:39.883 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:39.883 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:39.883 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:39.883 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:39.884 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:39.884 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:39.884 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:39.885 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:39.885 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:39.889 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:39.890 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:39.890 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:41.884 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:41.884 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:41.884 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:41.886 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:41.886 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:41.886 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:41.886 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:41.886 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:41.887 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:41.892 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:41.892 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:00:41.893 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:00:45.888 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:00:45.888 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:00:45.888 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:00:45.889 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:00:45.889 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:00:45.889 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:00:45.890 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:00:45.891 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:00:45.891 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:00:45.897 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:00:45.898 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:00:45.898 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:00:48.045 143780 DEBUG oslo_service.periodic_task [req-9137505b-80fa-415d-a085-0178bcc81683 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 02:00:48.052 143780 DEBUG oslo_concurrency.lockutils [req-3b753941-43a5-4fe5-9f0b-dc43bce744af - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:00:48.052 143780 DEBUG oslo_concurrency.lockutils [req-3b753941-43a5-4fe5-9f0b-dc43bce744af - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 02:00:53.894 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:00:53.895 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:00:53.895 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:00:53.897 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:00:53.897 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:00:53.897 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:00:53.897 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:00:53.898 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:00:53.898 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:00:53.904 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:00:53.904 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:00:53.904 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:00:54.075 143779 DEBUG oslo_service.periodic_task [req-893745f5-9dc2-47d1-9cce-cdfdc6775be5 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 02:00:54.080 143779 DEBUG oslo_concurrency.lockutils [req-6934dd40-9525-428d-86b5-338af359fda9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:00:54.080 143779 DEBUG oslo_concurrency.lockutils [req-6934dd40-9525-428d-86b5-338af359fda9 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 02:00:59.063 143781 DEBUG oslo_service.periodic_task [req-144b4892-41bb-4a25-a86e-2d66f2f0c17d - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 02:00:59.067 143781 DEBUG oslo_concurrency.lockutils [req-e0e4127e-6767-4f59-b3b1-d723b59dd5df - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:00:59.068 143781 DEBUG oslo_concurrency.lockutils [req-e0e4127e-6767-4f59-b3b1-d723b59dd5df - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 02:01:06.119 143787 DEBUG oslo_service.periodic_task [req-d0d27435-52f5-4c34-9656-9453707d6c5b - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 02:01:06.122 143787 DEBUG oslo_concurrency.lockutils [req-03060ff0-492b-46eb-8d00-7c09f60d6c86 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:01:06.123 143787 DEBUG oslo_concurrency.lockutils [req-03060ff0-492b-46eb-8d00-7c09f60d6c86 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 02:01:09.896 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:09.897 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:09.897 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:09.899 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:09.899 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:09.899 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:09.899 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:09.900 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:09.900 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:09.905 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:09.906 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:09.906 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:10.810 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 0319dd4890a94c16ac783e936c30f171 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 02:01:10.810 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 0319dd4890a94c16ac783e936c30f171 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 02:01:10.810 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 0319dd4890a94c16ac783e936c30f171 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 02:01:10.810 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:10.810 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:10.810 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:10.810 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 0319dd4890a94c16ac783e936c30f171 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 02:01:10.810 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 0319dd4890a94c16ac783e936c30f171 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 02:01:10.810 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 0319dd4890a94c16ac783e936c30f171 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 02:01:10.810 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:10.811 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:10.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:10.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:10.811 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:10.811 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:10.811 143780 DEBUG oslo_concurrency.lockutils [req-5b0d9b99-bd35-4943-8510-8e4d2a1c8cc2 a8d9cf51aeed455f997d95cf7560a6af 9445b7beaa88459f85f110826a6a5098 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:01:10.811 143787 DEBUG oslo_concurrency.lockutils [req-5b0d9b99-bd35-4943-8510-8e4d2a1c8cc2 a8d9cf51aeed455f997d95cf7560a6af 9445b7beaa88459f85f110826a6a5098 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:01:10.811 143780 DEBUG oslo_concurrency.lockutils [req-5b0d9b99-bd35-4943-8510-8e4d2a1c8cc2 a8d9cf51aeed455f997d95cf7560a6af 9445b7beaa88459f85f110826a6a5098 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 02:01:10.811 143779 DEBUG oslo_concurrency.lockutils [req-5b0d9b99-bd35-4943-8510-8e4d2a1c8cc2 a8d9cf51aeed455f997d95cf7560a6af 9445b7beaa88459f85f110826a6a5098 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:01:10.812 143787 DEBUG oslo_concurrency.lockutils [req-5b0d9b99-bd35-4943-8510-8e4d2a1c8cc2 a8d9cf51aeed455f997d95cf7560a6af 9445b7beaa88459f85f110826a6a5098 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 02:01:10.812 143779 DEBUG oslo_concurrency.lockutils [req-5b0d9b99-bd35-4943-8510-8e4d2a1c8cc2 a8d9cf51aeed455f997d95cf7560a6af 9445b7beaa88459f85f110826a6a5098 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 02:01:10.812 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: 0319dd4890a94c16ac783e936c30f171 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324
2026-04-02 02:01:10.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:10.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:10.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: 0319dd4890a94c16ac783e936c30f171 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346
2026-04-02 02:01:10.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:10.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:10.813 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:10.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:10.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:10.813 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:10.813 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:10.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:10.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:10.813 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:10.814 143781 DEBUG oslo_concurrency.lockutils [req-5b0d9b99-bd35-4943-8510-8e4d2a1c8cc2 a8d9cf51aeed455f997d95cf7560a6af 9445b7beaa88459f85f110826a6a5098 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:01:10.814 143781 DEBUG oslo_concurrency.lockutils [req-5b0d9b99-bd35-4943-8510-8e4d2a1c8cc2 a8d9cf51aeed455f997d95cf7560a6af 9445b7beaa88459f85f110826a6a5098 - 6b4d6e1f90944f88b6d38eafd30f73c5 6b4d6e1f90944f88b6d38eafd30f73c5] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.delete_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 02:01:10.815 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:10.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:10.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:11.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:11.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:11.814 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:11.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:11.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:11.815 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:11.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:11.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:11.815 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:11.816 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:11.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:11.817 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:13.816 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:13.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:13.817 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:13.817 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:13.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:13.818 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:13.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:13.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:13.818 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:13.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:13.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:13.820 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:17.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:17.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:17.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:17.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:17.820 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:17.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:17.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:17.820 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:17.820 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:17.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:17.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:17.822 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:19.044 143780 DEBUG oslo_service.periodic_task [req-3b753941-43a5-4fe5-9f0b-dc43bce744af - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 02:01:19.048 143780 DEBUG oslo_concurrency.lockutils [req-85a66e42-99f4-4bdd-82cf-72ddf9db042a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:01:19.049 143780 DEBUG oslo_concurrency.lockutils [req-85a66e42-99f4-4bdd-82cf-72ddf9db042a - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 02:01:24.092 143779 DEBUG oslo_service.periodic_task [req-6934dd40-9525-428d-86b5-338af359fda9 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 02:01:24.096 143779 DEBUG oslo_concurrency.lockutils [req-dddc96ab-3cde-4fad-a3fa-fb5a1538dfa3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:01:24.097 143779 DEBUG oslo_concurrency.lockutils [req-dddc96ab-3cde-4fad-a3fa-fb5a1538dfa3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 02:01:25.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:25.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:25.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:25.822 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:25.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:25.822 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:25.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:25.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:25.823 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:25.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:25.824 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:25.825 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:30.041 143781 DEBUG oslo_service.periodic_task [req-e0e4127e-6767-4f59-b3b1-d723b59dd5df - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 02:01:30.046 143781 DEBUG oslo_concurrency.lockutils [req-9527fb82-5d22-4d20-91b6-e14cc462d89f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:01:30.046 143781 DEBUG oslo_concurrency.lockutils [req-9527fb82-5d22-4d20-91b6-e14cc462d89f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 02:01:37.083 143787 DEBUG oslo_service.periodic_task [req-03060ff0-492b-46eb-8d00-7c09f60d6c86 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 02:01:37.087 143787 DEBUG oslo_concurrency.lockutils [req-d4a8c920-c0da-4797-bae1-a6e641e29a3c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:01:37.087 143787 DEBUG oslo_concurrency.lockutils [req-d4a8c920-c0da-4797-bae1-a6e641e29a3c - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 02:01:41.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:41.824 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:41.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:41.824 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:41.824 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:41.824 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:41.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:41.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:41.825 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:41.826 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:01:41.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:01:41.827 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:01:50.044 143780 DEBUG oslo_service.periodic_task [req-85a66e42-99f4-4bdd-82cf-72ddf9db042a - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 02:01:50.050 143780 DEBUG oslo_concurrency.lockutils [req-b52d7603-fcb0-4860-9b55-03ba16cebcb3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:01:50.050 143780 DEBUG oslo_concurrency.lockutils [req-b52d7603-fcb0-4860-9b55-03ba16cebcb3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 02:01:55.058 143779 DEBUG oslo_service.periodic_task [req-dddc96ab-3cde-4fad-a3fa-fb5a1538dfa3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 02:01:55.064 143779 DEBUG oslo_concurrency.lockutils [req-46b628ed-51b0-485d-979a-e522bde43657 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:01:55.066 143779 DEBUG oslo_concurrency.lockutils [req-46b628ed-51b0-485d-979a-e522bde43657 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 02:02:00.054 143781 DEBUG oslo_service.periodic_task [req-9527fb82-5d22-4d20-91b6-e14cc462d89f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 02:02:00.059 143781 DEBUG oslo_concurrency.lockutils [req-b2a5e6ce-2ae9-4828-9440-e432e8c56b08 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:02:00.059 143781 DEBUG oslo_concurrency.lockutils [req-b2a5e6ce-2ae9-4828-9440-e432e8c56b08 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 02:02:08.083 143787 DEBUG oslo_service.periodic_task [req-d4a8c920-c0da-4797-bae1-a6e641e29a3c - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 02:02:08.087 143787 DEBUG oslo_concurrency.lockutils [req-faeeb16a-3583-4331-9ce2-e96b0b452cdc - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:02:08.087 143787 DEBUG oslo_concurrency.lockutils [req-faeeb16a-3583-4331-9ce2-e96b0b452cdc - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 02:02:20.061 143780 DEBUG oslo_service.periodic_task [req-b52d7603-fcb0-4860-9b55-03ba16cebcb3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 02:02:20.066 143780 DEBUG oslo_concurrency.lockutils [req-a1b6fef1-ee9b-4173-95e4-eb4f2eaf0b3f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:02:20.066 143780 DEBUG oslo_concurrency.lockutils [req-a1b6fef1-ee9b-4173-95e4-eb4f2eaf0b3f - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 02:02:23.188 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:02:23.188 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:02:23.188 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:02:23.190 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:02:23.190 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:02:23.190 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:02:23.206 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:02:23.206 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:02:23.207 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:02:23.233 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:02:23.234 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:02:23.234 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:02:26.057 143779 DEBUG oslo_service.periodic_task [req-46b628ed-51b0-485d-979a-e522bde43657 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 02:02:26.061 143779 DEBUG oslo_concurrency.lockutils [req-28c16ed4-c88b-49f0-a2ec-23b9b24df7d3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:02:26.062 143779 DEBUG
oslo_concurrency.lockutils [req-28c16ed4-c88b-49f0-a2ec-23b9b24df7d3 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:02:30.066 143781 DEBUG oslo_service.periodic_task [req-b2a5e6ce-2ae9-4828-9440-e432e8c56b08 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 02:02:30.070 143781 DEBUG oslo_concurrency.lockutils [req-69e47f65-2047-463d-a85f-4d5cf2ec4557 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:02:30.070 143781 DEBUG oslo_concurrency.lockutils [req-69e47f65-2047-463d-a85f-4d5cf2ec4557 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:02:30.611 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: a907919cabef4f3fb9fc42e860ac98e8 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 02:02:30.611 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: a907919cabef4f3fb9fc42e860ac98e8 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 02:02:30.611 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] received message with unique_id: a907919cabef4f3fb9fc42e860ac98e8 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 02:02:30.611 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] received 
message with unique_id: a907919cabef4f3fb9fc42e860ac98e8 __call__ /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:324 2026-04-02 02:02:30.611 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:30.611 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:30.611 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:30.611 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:30.611 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: a907919cabef4f3fb9fc42e860ac98e8 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 02:02:30.611 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: a907919cabef4f3fb9fc42e860ac98e8 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 02:02:30.611 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: a907919cabef4f3fb9fc42e860ac98e8 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 02:02:30.611 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Poll the incoming message with unique_id: a907919cabef4f3fb9fc42e860ac98e8 poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:346 2026-04-02 02:02:30.611 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:30.611 143781 DEBUG oslo_messaging._drivers.amqpdriver 
[-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:30.611 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:30.611 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:30.611 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:02:30.612 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:02:30.612 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:02:30.612 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:02:30.612 143781 DEBUG oslo_concurrency.lockutils [req-814d5164-aad8-4208-a101-c99491a0490d - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:02:30.612 143780 DEBUG oslo_concurrency.lockutils [req-814d5164-aad8-4208-a101-c99491a0490d - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:02:30.612 143779 DEBUG oslo_concurrency.lockutils [req-814d5164-aad8-4208-a101-c99491a0490d - - - - -] Lock "host_instance" acquired by 
"nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:02:30.612 143787 DEBUG oslo_concurrency.lockutils [req-814d5164-aad8-4208-a101-c99491a0490d - - - - -] Lock "host_instance" acquired by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:02:30.612 143781 DEBUG nova.scheduler.host_manager [req-814d5164-aad8-4208-a101-c99491a0490d - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 02:02:30.612 143780 DEBUG nova.scheduler.host_manager [req-814d5164-aad8-4208-a101-c99491a0490d - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 02:02:30.612 143779 DEBUG nova.scheduler.host_manager [req-814d5164-aad8-4208-a101-c99491a0490d - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 02:02:30.612 143781 DEBUG oslo_concurrency.lockutils [req-814d5164-aad8-4208-a101-c99491a0490d - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:02:30.612 143787 DEBUG nova.scheduler.host_manager [req-814d5164-aad8-4208-a101-c99491a0490d - - - - -] Successfully synced instances from host 'cn-jenkins-deploy-platform-juju-os-690-1'. 
sync_instance_info /usr/lib/python3/dist-packages/nova/scheduler/host_manager.py:956 2026-04-02 02:02:30.613 143780 DEBUG oslo_concurrency.lockutils [req-814d5164-aad8-4208-a101-c99491a0490d - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:02:30.613 143779 DEBUG oslo_concurrency.lockutils [req-814d5164-aad8-4208-a101-c99491a0490d - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:02:30.613 143787 DEBUG oslo_concurrency.lockutils [req-814d5164-aad8-4208-a101-c99491a0490d - - - - -] Lock "host_instance" "released" by "nova.scheduler.host_manager.HostManager.sync_instance_info" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:02:30.613 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:02:30.613 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:30.613 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:02:30.614 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:02:30.614 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:02:30.614 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running 
poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:30.614 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:30.614 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:02:30.614 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:02:30.615 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:02:30.615 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:30.615 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:02:31.614 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:02:31.614 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:31.614 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:02:31.615 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:02:31.615 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running 
poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:31.615 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:02:31.616 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:02:31.616 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:31.616 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:02:31.616 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:02:31.617 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:31.617 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:02:33.616 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:02:33.616 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:33.616 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:02:33.617 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:02:33.617 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:02:33.618 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:33.618 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:33.618 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:02:33.618 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:02:33.618 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:02:33.618 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:33.618 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:02:37.618 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:02:37.618 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:37.618 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener 
connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:02:37.620 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:02:37.620 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:02:37.620 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:37.620 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:37.620 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:02:37.620 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:02:37.621 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:02:37.621 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:37.621 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:02:38.092 143787 DEBUG oslo_service.periodic_task [req-faeeb16a-3583-4331-9ce2-e96b0b452cdc - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 
2026-04-02 02:02:38.096 143787 DEBUG oslo_concurrency.lockutils [req-7bcba751-223a-446b-b279-23b3d2b3f620 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:02:38.097 143787 DEBUG oslo_concurrency.lockutils [req-7bcba751-223a-446b-b279-23b3d2b3f620 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:02:45.621 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:02:45.622 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:45.622 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:02:45.622 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:02:45.622 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:45.623 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:02:45.627 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:02:45.627 143787 DEBUG 
oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:45.627 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:02:45.628 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:02:45.629 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:02:45.629 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:02:51.044 143780 DEBUG oslo_service.periodic_task [req-a1b6fef1-ee9b-4173-95e4-eb4f2eaf0b3f - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 02:02:51.049 143780 DEBUG oslo_concurrency.lockutils [req-6843d97b-7d87-4ad5-80af-5195926bf604 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:02:51.049 143780 DEBUG oslo_concurrency.lockutils [req-6843d97b-7d87-4ad5-80af-5195926bf604 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:02:57.057 143779 DEBUG oslo_service.periodic_task [req-28c16ed4-c88b-49f0-a2ec-23b9b24df7d3 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells 
run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 02:02:57.061 143779 DEBUG oslo_concurrency.lockutils [req-0f36969d-ca73-4db9-82b1-f229e0420a19 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:02:57.061 143779 DEBUG oslo_concurrency.lockutils [req-0f36969d-ca73-4db9-82b1-f229e0420a19 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:03:01.041 143781 DEBUG oslo_service.periodic_task [req-69e47f65-2047-463d-a85f-4d5cf2ec4557 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210 2026-04-02 02:03:01.046 143781 DEBUG oslo_concurrency.lockutils [req-5f11729d-a32f-4113-acbb-ebc5e1d83dbf - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386 2026-04-02 02:03:01.046 143781 DEBUG oslo_concurrency.lockutils [req-5f11729d-a32f-4113-acbb-ebc5e1d83dbf - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400 2026-04-02 02:03:01.624 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:03:01.624 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll 
/usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:03:01.624 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:03:01.625 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:03:01.625 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:03:01.625 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:03:01.629 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:03:01.629 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:03:01.630 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:03:01.631 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360 2026-04-02 02:03:01.631 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343 2026-04-02 02:03:01.631 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357 2026-04-02 02:03:09.083 143787 DEBUG oslo_service.periodic_task 
[req-7bcba751-223a-446b-b279-23b3d2b3f620 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 02:03:09.088 143787 DEBUG oslo_concurrency.lockutils [req-7d0e1dec-f732-412d-af7f-1d03a160db85 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:03:09.088 143787 DEBUG oslo_concurrency.lockutils [req-7d0e1dec-f732-412d-af7f-1d03a160db85 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 02:03:21.057 143780 DEBUG oslo_service.periodic_task [req-6843d97b-7d87-4ad5-80af-5195926bf604 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 02:03:21.061 143780 DEBUG oslo_concurrency.lockutils [req-d22e0705-b1b1-46b6-9484-c4dfd29f1fde - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:03:21.062 143780 DEBUG oslo_concurrency.lockutils [req-d22e0705-b1b1-46b6-9484-c4dfd29f1fde - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 02:03:27.066 143779 DEBUG oslo_service.periodic_task [req-0f36969d-ca73-4db9-82b1-f229e0420a19 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 02:03:27.071 143779 DEBUG oslo_concurrency.lockutils [req-5926c566-bef6-48c9-9364-130f34a4ea0e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:03:27.071 143779 DEBUG oslo_concurrency.lockutils [req-5926c566-bef6-48c9-9364-130f34a4ea0e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 02:03:32.042 143781 DEBUG oslo_service.periodic_task [req-5f11729d-a32f-4113-acbb-ebc5e1d83dbf - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 02:03:32.046 143781 DEBUG oslo_concurrency.lockutils [req-2ba78ea5-c089-47b8-8a2d-8d67300cfa91 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:03:32.046 143781 DEBUG oslo_concurrency.lockutils [req-2ba78ea5-c089-47b8-8a2d-8d67300cfa91 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 02:03:33.626 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:03:33.627 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:03:33.627 143781 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:03:33.629 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:03:33.630 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:03:33.630 143780 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:03:33.631 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:03:33.631 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:03:33.631 143787 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:03:33.633 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection timeout poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:360
2026-04-02 02:03:33.634 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] Listener is running poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:343
2026-04-02 02:03:33.634 143779 DEBUG oslo_messaging._drivers.amqpdriver [-] AMQPListener connection consume poll /usr/lib/python3/dist-packages/oslo_messaging/_drivers/amqpdriver.py:357
2026-04-02 02:03:39.094 143787 DEBUG oslo_service.periodic_task [req-7d0e1dec-f732-412d-af7f-1d03a160db85 - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 02:03:39.099 143787 DEBUG oslo_concurrency.lockutils [req-37466a6c-16cf-48dc-b9b6-a97d3c5e506e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:03:39.099 143787 DEBUG oslo_concurrency.lockutils [req-37466a6c-16cf-48dc-b9b6-a97d3c5e506e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 02:03:52.044 143780 DEBUG oslo_service.periodic_task [req-d22e0705-b1b1-46b6-9484-c4dfd29f1fde - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 02:03:52.049 143780 DEBUG oslo_concurrency.lockutils [req-b2673b28-8bab-45ad-ad5c-50ecd075ce45 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:03:52.049 143780 DEBUG oslo_concurrency.lockutils [req-b2673b28-8bab-45ad-ad5c-50ecd075ce45 - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400
2026-04-02 02:03:57.076 143779 DEBUG oslo_service.periodic_task [req-5926c566-bef6-48c9-9364-130f34a4ea0e - - - - -] Running periodic task SchedulerManager._discover_hosts_in_cells run_periodic_tasks /usr/lib/python3/dist-packages/oslo_service/periodic_task.py:210
2026-04-02 02:03:57.080 143779 DEBUG oslo_concurrency.lockutils [req-b4ae0283-b0c1-4050-90cb-66c75cfb7f0e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:386
2026-04-02 02:03:57.080 143779 DEBUG oslo_concurrency.lockutils [req-b4ae0283-b0c1-4050-90cb-66c75cfb7f0e - - - - -] Lock "93adf21c-64ef-44ce-b3d6-e2add9f3b92e" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python3/dist-packages/oslo_concurrency/lockutils.py:400