Apr 22 15:55:59.839640 ip-10-0-135-9 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 15:55:59.839654 ip-10-0-135-9 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 15:55:59.839664 ip-10-0-135-9 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 15:55:59.840012 ip-10-0-135-9 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 15:56:10.032704 ip-10-0-135-9 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 15:56:10.032721 ip-10-0-135-9 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot e427532d116c428dabbf10567d7ddb21 --
Apr 22 15:58:38.994549 ip-10-0-135-9 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 15:58:39.392458 ip-10-0-135-9 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:58:39.392458 ip-10-0-135-9 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 15:58:39.392458 ip-10-0-135-9 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:58:39.392458 ip-10-0-135-9 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 15:58:39.392458 ip-10-0-135-9 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:58:39.393234 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.393124    2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 15:58:39.397387 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397358    2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:58:39.397387 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397382    2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:58:39.397387 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397387    2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:58:39.397387 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397391    2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:58:39.397387 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397395    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:58:39.397712 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397399    2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:58:39.397712 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397403    2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:58:39.397712 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397409    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:58:39.397712 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397413    2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:58:39.397712 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397416    2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:58:39.397712 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397420    2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:58:39.397712 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397424    2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:58:39.397712 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397428    2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:58:39.397712 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397431    2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:58:39.397712 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397434    2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:58:39.397712 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397438    2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:58:39.397712 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397442    2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:58:39.397712 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397446    2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:58:39.397712 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397450    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:58:39.397712 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397454    2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:58:39.397712 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397461    2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:58:39.397712 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397465    2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:58:39.397712 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397469    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:58:39.397712 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397472    2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:58:39.397712 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397476    2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:58:39.398536 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397479    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:58:39.398536 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397482    2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:58:39.398536 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397487    2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:58:39.398536 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397490    2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:58:39.398536 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397494    2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:58:39.398536 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397498    2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:58:39.398536 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397502    2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:58:39.398536 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397506    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:58:39.398536 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397509    2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:58:39.398536 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397513    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:58:39.398536 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397516    2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:58:39.398536 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397521    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:58:39.398536 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397524    2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:58:39.398536 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397529    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:58:39.398536 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397536    2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:58:39.398536 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397545    2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:58:39.398536 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397549    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:58:39.398536 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397555    2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:58:39.398536 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397560    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:58:39.399366 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397565    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:58:39.399366 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397569    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:58:39.399366 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397574    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:58:39.399366 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397578    2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:58:39.399366 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397582    2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:58:39.399366 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397587    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:58:39.399366 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397592    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:58:39.399366 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397597    2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:58:39.399366 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397601    2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:58:39.399366 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397606    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:58:39.399366 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397611    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:58:39.399366 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397615    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:58:39.399366 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397620    2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:58:39.399366 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397624    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:58:39.399366 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397628    2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:58:39.399366 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397632    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:58:39.399366 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397636    2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:58:39.399366 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397642    2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:58:39.399366 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397649    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:58:39.399366 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397653    2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:58:39.400232 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397658    2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:58:39.400232 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397662    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:58:39.400232 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397668    2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:58:39.400232 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397672    2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:58:39.400232 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397676    2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:58:39.400232 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397681    2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:58:39.400232 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397687    2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:58:39.400232 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397691    2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:58:39.400232 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397696    2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:58:39.400232 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397700    2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:58:39.400232 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397704    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:58:39.400232 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397708    2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:58:39.400232 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397711    2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:58:39.400232 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397715    2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:58:39.400232 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397721    2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:58:39.400232 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397725    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:58:39.400232 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397729    2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:58:39.400232 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397733    2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:58:39.400232 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397737    2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:58:39.400232 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397742    2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:58:39.400869 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397746    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:58:39.400869 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.397750    2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:58:39.400869 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398430    2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:58:39.400869 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398442    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:58:39.400869 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398446    2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:58:39.400869 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398451    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:58:39.400869 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398455    2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:58:39.400869 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398459    2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:58:39.400869 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398464    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:58:39.400869 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398468    2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:58:39.400869 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398472    2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:58:39.400869 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398477    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:58:39.400869 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398481    2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:58:39.400869 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398486    2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:58:39.400869 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398490    2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:58:39.400869 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398494    2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:58:39.400869 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398498    2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:58:39.400869 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398503    2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:58:39.400869 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398508    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:58:39.400869 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398513    2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:58:39.401560 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398517    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:58:39.401560 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398522    2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:58:39.401560 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398528    2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:58:39.401560 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398535    2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:58:39.401560 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398539    2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:58:39.401560 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398544    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:58:39.401560 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398550    2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:58:39.401560 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398554    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:58:39.401560 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398558    2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:58:39.401560 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398562    2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:58:39.401560 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398566    2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:58:39.401560 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398570    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:58:39.401560 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398575    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:58:39.401560 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398579    2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:58:39.401560 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398584    2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:58:39.401560 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398588    2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:58:39.401560 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398592    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:58:39.401560 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398596    2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:58:39.401560 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398600    2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:58:39.402130 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398604    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:58:39.402130 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398608    2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:58:39.402130 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398612    2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:58:39.402130 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398616    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:58:39.402130 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398620    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:58:39.402130 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398625    2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:58:39.402130 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398629    2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:58:39.402130 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398633    2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:58:39.402130 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398638    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:58:39.402130 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398641    2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:58:39.402130 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398645    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:58:39.402130 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398651    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:58:39.402130 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398657    2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:58:39.402130 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398662    2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:58:39.402130 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398666    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:58:39.402130 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398670    2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:58:39.402130 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398674    2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:58:39.402130 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398679    2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:58:39.402130 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398683    2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:58:39.402130 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398688    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:58:39.402714 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398692    2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:58:39.402714 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398697    2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:58:39.402714 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398701    2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:58:39.402714 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398705    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:58:39.402714 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398710    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:58:39.402714 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398717    2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:58:39.402714 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398723    2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:58:39.402714 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398728    2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:58:39.402714 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398732    2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:58:39.402714 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398736    2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:58:39.402714 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398740    2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:58:39.402714 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398744    2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:58:39.402714 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398748    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:58:39.402714 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398752    2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:58:39.402714 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398756    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:58:39.402714 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398760    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:58:39.402714 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398764    2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:58:39.402714 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398768    2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:58:39.402714 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398772    2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:58:39.402714 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398776    2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:58:39.403273 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398780    2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:58:39.403273 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398785    2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:58:39.403273 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398789    2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:58:39.403273 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398793    2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:58:39.403273 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398797    2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:58:39.403273 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398801    2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:58:39.403273 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398805    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:58:39.403273 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398809    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:58:39.403273 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.398814    2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:58:39.403273 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.398925    2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 15:58:39.403273 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.398936    2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 15:58:39.403273 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.398946    2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 15:58:39.403273 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.398953    2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 15:58:39.403273 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.398968    2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 15:58:39.403273 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.398974    2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 15:58:39.403273 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.398981    2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 15:58:39.403273 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.398988    2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 15:58:39.403273 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.398994    2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 15:58:39.403273 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.398999    2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 15:58:39.403273 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399004    2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 15:58:39.403273 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399009    2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 15:58:39.404107 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399014    2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 15:58:39.404107 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399019    2572 flags.go:64] FLAG: --cgroup-root=""
Apr 22 15:58:39.404107 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399024    2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 15:58:39.404107 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399028    2572 flags.go:64] FLAG: --client-ca-file=""
Apr 22 15:58:39.404107 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399033    2572 flags.go:64] FLAG: --cloud-config=""
Apr 22 15:58:39.404107 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399037    2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 15:58:39.404107 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399042    2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 15:58:39.404107 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399049    2572 flags.go:64] FLAG: --cluster-domain=""
Apr 22 15:58:39.404107 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399054    2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 15:58:39.404107 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399059    2572 flags.go:64] FLAG: --config-dir=""
Apr 22 15:58:39.404107 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399064    2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 15:58:39.404107 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399069    2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 15:58:39.404107 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399075    2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 15:58:39.404107 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399080    2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 15:58:39.404107 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399085    2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 15:58:39.404107 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399090    2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 15:58:39.404107 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399095    2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 15:58:39.404107 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399099    2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 15:58:39.404107 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399104    2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 15:58:39.404107 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399109    2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 15:58:39.404107 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399114    2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 15:58:39.404107 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399121    2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 15:58:39.404107 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399125    2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 15:58:39.404107 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399130    2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399137    2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399143    2572 flags.go:64] FLAG: --enable-server="true"
Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399149    2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399155    2572 flags.go:64] FLAG: --event-burst="100"
Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399167    2572 flags.go:64] FLAG: --event-qps="50"
Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399172    2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399177    2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399182    2572 flags.go:64] FLAG: --eviction-hard=""
Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399209    2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399215    2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399220    2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399225    2572 flags.go:64] FLAG: --eviction-soft=""
Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399229    2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399234    2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399238    2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399243    2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399247    2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399252    2572 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399256    2572 flags.go:64] FLAG: --feature-gates=""
Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399263    2572 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399268    2572 flags.go:64] FLAG:
--global-housekeeping-interval="1m0s" Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399273 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399279 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399284 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399290 2572 flags.go:64] FLAG: --help="false" Apr 22 15:58:39.405237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399294 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-135-9.ec2.internal" Apr 22 15:58:39.406269 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399299 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 15:58:39.406269 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399304 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 15:58:39.406269 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399309 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 15:58:39.406269 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399314 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 15:58:39.406269 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399320 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 15:58:39.406269 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399324 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 15:58:39.406269 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399331 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 15:58:39.406269 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399336 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 15:58:39.406269 ip-10-0-135-9 kubenswrapper[2572]: I0422 
15:58:39.399342 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 15:58:39.406269 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399346 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 15:58:39.406269 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399352 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 15:58:39.406269 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399356 2572 flags.go:64] FLAG: --kube-reserved="" Apr 22 15:58:39.406269 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399361 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 15:58:39.406269 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399365 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 15:58:39.406269 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399370 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 15:58:39.406269 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399374 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 15:58:39.406269 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399378 2572 flags.go:64] FLAG: --lock-file="" Apr 22 15:58:39.406269 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399382 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 15:58:39.406269 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399387 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 15:58:39.406269 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399393 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 15:58:39.406269 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399402 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 15:58:39.406269 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399407 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 15:58:39.406269 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399412 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 15:58:39.406269 ip-10-0-135-9 
kubenswrapper[2572]: I0422 15:58:39.399417 2572 flags.go:64] FLAG: --logging-format="text" Apr 22 15:58:39.406969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399421 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 15:58:39.406969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399427 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 15:58:39.406969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399431 2572 flags.go:64] FLAG: --manifest-url="" Apr 22 15:58:39.406969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399435 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 22 15:58:39.406969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399442 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 15:58:39.406969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399448 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 15:58:39.406969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399454 2572 flags.go:64] FLAG: --max-pods="110" Apr 22 15:58:39.406969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399459 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 15:58:39.406969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399464 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 15:58:39.406969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399468 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 15:58:39.406969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399473 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 15:58:39.406969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399477 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 15:58:39.406969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399482 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 15:58:39.406969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399487 2572 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 15:58:39.406969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399501 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 15:58:39.406969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399506 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 15:58:39.406969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399511 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 15:58:39.406969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399516 2572 flags.go:64] FLAG: --pod-cidr="" Apr 22 15:58:39.406969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399521 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 15:58:39.406969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399529 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 15:58:39.406969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399534 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 15:58:39.406969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399539 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 22 15:58:39.406969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399544 2572 flags.go:64] FLAG: --port="10250" Apr 22 15:58:39.406969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399548 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399553 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-03fcd892704318cd2" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399558 2572 flags.go:64] FLAG: --qos-reserved="" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399567 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399572 2572 flags.go:64] FLAG: 
--register-node="true" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399577 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399581 2572 flags.go:64] FLAG: --register-with-taints="" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399587 2572 flags.go:64] FLAG: --registry-burst="10" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399592 2572 flags.go:64] FLAG: --registry-qps="5" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399597 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399601 2572 flags.go:64] FLAG: --reserved-memory="" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399607 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399612 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399617 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399621 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399626 2572 flags.go:64] FLAG: --runonce="false" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399630 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399635 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399639 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399644 2572 flags.go:64] FLAG: 
--serialize-image-pulls="true" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399649 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399653 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399659 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399665 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399670 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399675 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 15:58:39.407573 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399679 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 15:58:39.408187 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399686 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 15:58:39.408187 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399691 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 15:58:39.408187 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399696 2572 flags.go:64] FLAG: --system-cgroups="" Apr 22 15:58:39.408187 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399701 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 15:58:39.408187 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399711 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 15:58:39.408187 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399715 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 22 15:58:39.408187 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399720 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 15:58:39.408187 ip-10-0-135-9 
kubenswrapper[2572]: I0422 15:58:39.399727 2572 flags.go:64] FLAG: --tls-min-version="" Apr 22 15:58:39.408187 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399733 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 15:58:39.408187 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399738 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 15:58:39.408187 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399743 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 15:58:39.408187 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399747 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 15:58:39.408187 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399752 2572 flags.go:64] FLAG: --v="2" Apr 22 15:58:39.408187 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399759 2572 flags.go:64] FLAG: --version="false" Apr 22 15:58:39.408187 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399765 2572 flags.go:64] FLAG: --vmodule="" Apr 22 15:58:39.408187 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399771 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 15:58:39.408187 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.399777 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 15:58:39.408187 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.399973 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 15:58:39.408187 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.399981 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 15:58:39.408187 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.399986 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 15:58:39.408187 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.399991 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 15:58:39.408187 ip-10-0-135-9 kubenswrapper[2572]: W0422 
15:58:39.399996 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 15:58:39.408187 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400002 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 15:58:39.408807 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400008 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 15:58:39.408807 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400014 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 15:58:39.408807 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400019 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 15:58:39.408807 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400025 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 15:58:39.408807 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400030 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 15:58:39.408807 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400037 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 15:58:39.408807 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400041 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 15:58:39.408807 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400045 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 15:58:39.408807 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400050 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 15:58:39.408807 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400054 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 15:58:39.408807 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400059 2572 feature_gate.go:328] unrecognized feature gate: 
VSphereMultiNetworks Apr 22 15:58:39.408807 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400063 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 15:58:39.408807 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400067 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 15:58:39.408807 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400071 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 15:58:39.408807 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400075 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 15:58:39.408807 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400080 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 15:58:39.408807 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400083 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 15:58:39.408807 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400089 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 15:58:39.408807 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400094 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 15:58:39.408807 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400098 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 15:58:39.409333 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400102 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 15:58:39.409333 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400106 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 15:58:39.409333 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400111 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 15:58:39.409333 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400114 2572 
feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 15:58:39.409333 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400118 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 15:58:39.409333 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400122 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 15:58:39.409333 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400128 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 15:58:39.409333 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400134 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 15:58:39.409333 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400138 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 15:58:39.409333 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400142 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 15:58:39.409333 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400146 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 15:58:39.409333 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400150 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 15:58:39.409333 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400155 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 15:58:39.409333 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400159 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 15:58:39.409333 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400163 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 15:58:39.409333 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400167 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 15:58:39.409333 ip-10-0-135-9 
kubenswrapper[2572]: W0422 15:58:39.400171 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 15:58:39.409333 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400177 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 15:58:39.409333 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400181 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 15:58:39.409799 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400186 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 15:58:39.409799 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400211 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 15:58:39.409799 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400217 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 15:58:39.409799 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400221 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 15:58:39.409799 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400226 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 15:58:39.409799 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400229 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 15:58:39.409799 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400234 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 15:58:39.409799 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400238 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 15:58:39.409799 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400241 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 15:58:39.409799 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400245 2572 feature_gate.go:328] unrecognized feature 
gate: NutanixMultiSubnets Apr 22 15:58:39.409799 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400251 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 15:58:39.409799 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400255 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 15:58:39.409799 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400259 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 15:58:39.409799 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400263 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 15:58:39.409799 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400267 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 15:58:39.409799 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400271 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 15:58:39.409799 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400275 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 15:58:39.409799 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400279 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 15:58:39.409799 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400284 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 15:58:39.409799 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400288 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 15:58:39.410395 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400292 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 15:58:39.410395 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400296 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 15:58:39.410395 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400300 2572 
feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 15:58:39.410395 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400304 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 15:58:39.410395 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400308 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 15:58:39.410395 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400312 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 22 15:58:39.410395 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400316 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 15:58:39.410395 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400320 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 15:58:39.410395 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400324 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 15:58:39.410395 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400329 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 15:58:39.410395 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400335 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 15:58:39.410395 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400339 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 15:58:39.410395 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400344 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 15:58:39.410395 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400348 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 15:58:39.410395 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400351 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 15:58:39.410395 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400355 2572 
feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 15:58:39.410395 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400359 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 15:58:39.410395 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400364 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 15:58:39.410395 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400368 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 15:58:39.410395 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400373 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 15:58:39.410882 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.400377 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 15:58:39.410882 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.400385 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 15:58:39.410882 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.408897 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 15:58:39.410882 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.408919 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 15:58:39.410882 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.408970 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 15:58:39.410882 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.408977 2572 
feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 15:58:39.410882 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.408980 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 15:58:39.410882 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.408984 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 15:58:39.410882 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.408987 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 15:58:39.410882 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.408989 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 15:58:39.410882 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.408992 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 15:58:39.410882 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.408995 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 15:58:39.410882 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.408998 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 15:58:39.410882 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409001 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 15:58:39.410882 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409003 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 15:58:39.410882 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409006 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 15:58:39.411304 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409009 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 15:58:39.411304 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409011 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 15:58:39.411304 ip-10-0-135-9 kubenswrapper[2572]: 
W0422 15:58:39.409014 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 15:58:39.411304 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409016 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 15:58:39.411304 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409020 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 15:58:39.411304 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409022 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 15:58:39.411304 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409025 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 15:58:39.411304 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409029 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 15:58:39.411304 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409032 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 15:58:39.411304 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409034 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 15:58:39.411304 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409037 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 15:58:39.411304 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409039 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 15:58:39.411304 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409042 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 15:58:39.411304 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409044 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 15:58:39.411304 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409047 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 15:58:39.411304 ip-10-0-135-9 kubenswrapper[2572]: 
W0422 15:58:39.409049 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 15:58:39.411304 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409052 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 15:58:39.411304 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409055 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 15:58:39.411304 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409057 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 15:58:39.411304 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409062 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 15:58:39.411793 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409065 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 22 15:58:39.411793 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409068 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 15:58:39.411793 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409070 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 15:58:39.411793 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409073 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 15:58:39.411793 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409075 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 15:58:39.411793 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409078 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 15:58:39.411793 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409080 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 15:58:39.411793 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409083 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 15:58:39.411793 ip-10-0-135-9 
kubenswrapper[2572]: W0422 15:58:39.409086 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 15:58:39.411793 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409088 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 15:58:39.411793 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409091 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 15:58:39.411793 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409093 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 15:58:39.411793 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409095 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 15:58:39.411793 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409098 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 15:58:39.411793 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409100 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 15:58:39.411793 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409103 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 15:58:39.411793 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409105 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 15:58:39.411793 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409108 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 15:58:39.411793 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409111 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 15:58:39.411793 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409113 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 15:58:39.412346 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409116 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 
15:58:39.412346 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409118 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 15:58:39.412346 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409122 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 15:58:39.412346 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409126 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 15:58:39.412346 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409129 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 15:58:39.412346 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409131 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 15:58:39.412346 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409134 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 15:58:39.412346 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409136 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 15:58:39.412346 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409139 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 15:58:39.412346 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409141 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 15:58:39.412346 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409144 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 15:58:39.412346 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409147 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 15:58:39.412346 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409151 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 15:58:39.412346 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409153 
2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 15:58:39.412346 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409155 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 15:58:39.412346 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409158 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 15:58:39.412346 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409160 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 15:58:39.412346 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409163 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 15:58:39.412346 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409165 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 15:58:39.412817 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409168 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 15:58:39.412817 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409171 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 15:58:39.412817 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409173 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 15:58:39.412817 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409175 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 15:58:39.412817 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409178 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 15:58:39.412817 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409182 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 15:58:39.412817 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409186 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 15:58:39.412817 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409209 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 15:58:39.412817 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409212 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 15:58:39.412817 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409215 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 15:58:39.412817 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409218 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 15:58:39.412817 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409221 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 15:58:39.412817 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409224 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 15:58:39.412817 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409227 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 15:58:39.412817 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409230 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 15:58:39.413208 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.409236 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 15:58:39.413208 
ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409363 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 15:58:39.413208 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409370 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 15:58:39.413208 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409373 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 15:58:39.413208 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409376 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 15:58:39.413208 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409380 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 15:58:39.413208 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409383 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 15:58:39.413208 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409385 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 15:58:39.413208 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409388 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 15:58:39.413208 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409390 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 15:58:39.413208 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409393 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 15:58:39.413208 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409396 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 15:58:39.413208 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409399 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 15:58:39.413208 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409402 2572 
feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 15:58:39.413208 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409405 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 15:58:39.413586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409407 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 15:58:39.413586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409410 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 15:58:39.413586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409412 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 15:58:39.413586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409415 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 15:58:39.413586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409417 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 15:58:39.413586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409420 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 15:58:39.413586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409423 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 15:58:39.413586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409425 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 15:58:39.413586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409428 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 15:58:39.413586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409430 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 15:58:39.413586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409433 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 15:58:39.413586 ip-10-0-135-9 kubenswrapper[2572]: W0422 
15:58:39.409435 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 15:58:39.413586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409437 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 15:58:39.413586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409440 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 15:58:39.413586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409442 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 22 15:58:39.413586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409445 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 15:58:39.413586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409447 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 15:58:39.413586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409449 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 15:58:39.413586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409452 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 15:58:39.413586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409455 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 15:58:39.414071 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409457 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 15:58:39.414071 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409459 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 15:58:39.414071 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409462 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 15:58:39.414071 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409465 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 15:58:39.414071 ip-10-0-135-9 kubenswrapper[2572]: 
W0422 15:58:39.409467 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 15:58:39.414071 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409469 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 15:58:39.414071 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409472 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 15:58:39.414071 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409474 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 15:58:39.414071 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409477 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 15:58:39.414071 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409480 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 15:58:39.414071 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409483 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 15:58:39.414071 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409485 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 15:58:39.414071 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409487 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 15:58:39.414071 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409490 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 15:58:39.414071 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409492 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 15:58:39.414071 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409495 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 15:58:39.414071 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409497 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 15:58:39.414071 
ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409499 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 15:58:39.414071 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409502 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 15:58:39.414071 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409505 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 15:58:39.414586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409509 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 15:58:39.414586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409512 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 15:58:39.414586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409514 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 15:58:39.414586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409517 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 15:58:39.414586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409519 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 15:58:39.414586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409521 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 15:58:39.414586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409524 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 15:58:39.414586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409527 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 15:58:39.414586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409529 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 15:58:39.414586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409531 2572 feature_gate.go:328] unrecognized 
feature gate: PinnedImages Apr 22 15:58:39.414586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409534 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 15:58:39.414586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409537 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 15:58:39.414586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409539 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 15:58:39.414586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409542 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 15:58:39.414586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409544 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 15:58:39.414586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409546 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 15:58:39.414586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409549 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 15:58:39.414586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409551 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 15:58:39.414586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409554 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 15:58:39.414586 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409556 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 15:58:39.415066 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409559 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 15:58:39.415066 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409561 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 15:58:39.415066 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409918 2572 
feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 15:58:39.415066 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409921 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 15:58:39.415066 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409924 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 15:58:39.415066 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409928 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 15:58:39.415066 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409931 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 15:58:39.415066 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409934 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 15:58:39.415066 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409937 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 15:58:39.415066 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409939 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 15:58:39.415066 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409942 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 15:58:39.415066 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:39.409944 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 15:58:39.415066 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.409949 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true 
TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 15:58:39.415066 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.410639 2572 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 15:58:39.415517 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.413398 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 15:58:39.415517 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.414258 2572 server.go:1019] "Starting client certificate rotation" Apr 22 15:58:39.415517 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.414361 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 15:58:39.415517 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.414401 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 15:58:39.434858 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.434834 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 15:58:39.441023 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.440994 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 15:58:39.459293 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.459264 2572 log.go:25] "Validated CRI v1 runtime API" Apr 22 15:58:39.465151 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.465129 2572 log.go:25] "Validated CRI v1 image API" Apr 22 15:58:39.466493 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.466469 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 15:58:39.466493 ip-10-0-135-9 
kubenswrapper[2572]: I0422 15:58:39.466488 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 15:58:39.470293 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.470267 2572 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 941e1e15-86f8-428a-baf5-4f5673ff22fb:/dev/nvme0n1p3 cde17c37-e469-49d1-8c27-5e7cdb22ed64:/dev/nvme0n1p4] Apr 22 15:58:39.470293 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.470289 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 15:58:39.476663 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.476516 2572 manager.go:217] Machine: {Timestamp:2026-04-22 15:58:39.474752568 +0000 UTC m=+0.374306636 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100315 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2724b29e6bd826a633b5751de9e396 SystemUUID:ec2724b2-9e6b-d826-a633-b5751de9e396 BootID:e427532d-116c-428d-abbf-10567d7ddb21 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 
HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:4d:56:38:07:1d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:4d:56:38:07:1d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5e:23:43:86:46:83 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 15:58:39.476663 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.476655 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 22 15:58:39.476766 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.476749 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 15:58:39.477828 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.477797 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 15:58:39.477979 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.477833 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-9.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container",
"CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 15:58:39.478035 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.477989 2572 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 15:58:39.478035 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.477998 2572 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 15:58:39.478035 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.478012 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 15:58:39.478745 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.478733 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 15:58:39.480052 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.480041 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 22 15:58:39.480182 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.480172 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 15:58:39.482155 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.482143 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 22 15:58:39.482206 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.482162 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 15:58:39.482206 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.482178 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 15:58:39.482206 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.482189 2572 kubelet.go:397] "Adding apiserver pod source" Apr 22 15:58:39.482347 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.482217 2572 apiserver.go:42] "Waiting for node sync before watching apiserver 
pods" Apr 22 15:58:39.484252 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.484115 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 15:58:39.484252 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.484141 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 15:58:39.484776 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.484758 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jnkh7" Apr 22 15:58:39.486923 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.486900 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 15:58:39.488187 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.488172 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 15:58:39.490149 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.490132 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 15:58:39.490149 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.490150 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 15:58:39.490294 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.490157 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 15:58:39.490294 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.490163 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 15:58:39.490294 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.490169 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 15:58:39.490294 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.490175 2572 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/secret" Apr 22 15:58:39.490294 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.490180 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 15:58:39.490294 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.490186 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 15:58:39.490294 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.490210 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 15:58:39.490294 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.490217 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 15:58:39.490294 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.490225 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 15:58:39.490294 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.490245 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 15:58:39.491129 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.491114 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 15:58:39.491165 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.491133 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 15:58:39.492150 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.492135 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jnkh7" Apr 22 15:58:39.492969 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:39.492946 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-9.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 15:58:39.493054 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:39.492968 2572 
reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 15:58:39.495095 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.495080 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 15:58:39.495155 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.495127 2572 server.go:1295] "Started kubelet" Apr 22 15:58:39.495236 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.495187 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 15:58:39.495290 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.495245 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 15:58:39.498455 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.498427 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 15:58:39.500182 ip-10-0-135-9 systemd[1]: Started Kubernetes Kubelet. Apr 22 15:58:39.500644 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.500621 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 15:58:39.501727 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.501709 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 22 15:58:39.505577 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.505543 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-135-9.ec2.internal" not found Apr 22 15:58:39.507504 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:39.507482 2572 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 15:58:39.508103 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.508090 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 15:58:39.508188 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.508164 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 15:58:39.508835 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.508813 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 15:58:39.508835 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.508816 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 15:58:39.508968 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.508848 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 15:58:39.508968 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.508927 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 22 15:58:39.508968 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.508935 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 22 15:58:39.509090 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:39.508995 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-9.ec2.internal\" not found" Apr 22 15:58:39.509560 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.509540 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 15:58:39.509560 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.509558 2572 factory.go:55] Registering systemd factory Apr 22 15:58:39.509728 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.509568 2572 factory.go:223] Registration of the systemd container factory 
successfully Apr 22 15:58:39.509842 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.509827 2572 factory.go:153] Registering CRI-O factory Apr 22 15:58:39.509892 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.509846 2572 factory.go:223] Registration of the crio container factory successfully Apr 22 15:58:39.509892 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.509872 2572 factory.go:103] Registering Raw factory Apr 22 15:58:39.509892 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.509883 2572 manager.go:1196] Started watching for new ooms in manager Apr 22 15:58:39.510730 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.510712 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:58:39.511450 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.511421 2572 manager.go:319] Starting recovery of all containers Apr 22 15:58:39.513490 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:39.513465 2572 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-135-9.ec2.internal\" not found" node="ip-10-0-135-9.ec2.internal" Apr 22 15:58:39.521044 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.521020 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-135-9.ec2.internal" not found Apr 22 15:58:39.521931 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.521759 2572 manager.go:324] Recovery completed Apr 22 15:58:39.526336 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.526315 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:58:39.529532 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.529507 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-9.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:58:39.529642 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.529548 2572 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-135-9.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:58:39.529642 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.529563 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-9.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:58:39.530158 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.530142 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 15:58:39.530245 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.530159 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 15:58:39.530245 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.530179 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 22 15:58:39.532507 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.532494 2572 policy_none.go:49] "None policy: Start" Apr 22 15:58:39.532551 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.532511 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 15:58:39.532551 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.532522 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 22 15:58:39.577436 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.577408 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-135-9.ec2.internal" not found Apr 22 15:58:39.579694 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.578911 2572 manager.go:341] "Starting Device Plugin manager" Apr 22 15:58:39.579694 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:39.578957 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 15:58:39.579694 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.578972 2572 server.go:85] "Starting device plugin registration server" Apr 22 15:58:39.579694 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.579272 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 15:58:39.579694 ip-10-0-135-9 kubenswrapper[2572]: I0422 
15:58:39.579282 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 15:58:39.579694 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.579382 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 15:58:39.579694 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.579467 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 15:58:39.579694 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.579476 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 15:58:39.580101 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:39.580051 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 15:58:39.580101 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:39.580088 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-9.ec2.internal\" not found" Apr 22 15:58:39.638252 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.638184 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 15:58:39.639695 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.639674 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 15:58:39.639804 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.639703 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 15:58:39.639804 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.639727 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 15:58:39.639804 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.639735 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 15:58:39.639804 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:39.639775 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 15:58:39.642225 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.642183 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:58:39.680330 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.680237 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:58:39.681150 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.681132 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-9.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:58:39.681233 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.681169 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-9.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:58:39.681233 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.681179 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-9.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:58:39.681233 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.681223 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-9.ec2.internal" Apr 22 15:58:39.689675 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.689654 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-9.ec2.internal" Apr 22 15:58:39.689734 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:39.689685 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-9.ec2.internal\": node \"ip-10-0-135-9.ec2.internal\" not found" Apr 22 15:58:39.706747 ip-10-0-135-9 
kubenswrapper[2572]: E0422 15:58:39.706720 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-9.ec2.internal\" not found" Apr 22 15:58:39.740116 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.740064 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-9.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-9.ec2.internal"] Apr 22 15:58:39.740173 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.740162 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:58:39.741159 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.741144 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-9.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:58:39.741262 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.741174 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-9.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:58:39.741262 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.741209 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-9.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:58:39.742520 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.742507 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:58:39.742649 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.742634 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-9.ec2.internal" Apr 22 15:58:39.742690 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.742669 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:58:39.743258 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.743236 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-9.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:58:39.743258 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.743258 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-9.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:58:39.743411 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.743268 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-9.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:58:39.743411 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.743277 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-9.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:58:39.743411 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.743297 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-9.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:58:39.743411 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.743306 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-9.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:58:39.744300 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.744282 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-9.ec2.internal" Apr 22 15:58:39.744389 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.744311 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:58:39.744983 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.744965 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-9.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:58:39.745089 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.744995 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-9.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:58:39.745089 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.745009 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-9.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:58:39.761829 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:39.761795 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-9.ec2.internal\" not found" node="ip-10-0-135-9.ec2.internal" Apr 22 15:58:39.765930 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:39.765907 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-9.ec2.internal\" not found" node="ip-10-0-135-9.ec2.internal" Apr 22 15:58:39.806834 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:39.806784 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-9.ec2.internal\" not found" Apr 22 15:58:39.810469 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.810449 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c011865a41651b31b930def52edfe788-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-9.ec2.internal\" (UID: 
\"c011865a41651b31b930def52edfe788\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-9.ec2.internal" Apr 22 15:58:39.810542 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.810478 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c011865a41651b31b930def52edfe788-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-9.ec2.internal\" (UID: \"c011865a41651b31b930def52edfe788\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-9.ec2.internal" Apr 22 15:58:39.810542 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.810496 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4bba7b470f499a19673a3db16932ed93-config\") pod \"kube-apiserver-proxy-ip-10-0-135-9.ec2.internal\" (UID: \"4bba7b470f499a19673a3db16932ed93\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-9.ec2.internal" Apr 22 15:58:39.907529 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:39.907486 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-9.ec2.internal\" not found" Apr 22 15:58:39.910803 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.910783 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c011865a41651b31b930def52edfe788-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-9.ec2.internal\" (UID: \"c011865a41651b31b930def52edfe788\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-9.ec2.internal" Apr 22 15:58:39.910847 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.910813 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c011865a41651b31b930def52edfe788-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-135-9.ec2.internal\" (UID: \"c011865a41651b31b930def52edfe788\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-9.ec2.internal" Apr 22 15:58:39.910847 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.910832 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4bba7b470f499a19673a3db16932ed93-config\") pod \"kube-apiserver-proxy-ip-10-0-135-9.ec2.internal\" (UID: \"4bba7b470f499a19673a3db16932ed93\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-9.ec2.internal" Apr 22 15:58:39.910923 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.910873 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4bba7b470f499a19673a3db16932ed93-config\") pod \"kube-apiserver-proxy-ip-10-0-135-9.ec2.internal\" (UID: \"4bba7b470f499a19673a3db16932ed93\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-9.ec2.internal" Apr 22 15:58:39.910923 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.910895 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c011865a41651b31b930def52edfe788-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-9.ec2.internal\" (UID: \"c011865a41651b31b930def52edfe788\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-9.ec2.internal" Apr 22 15:58:39.910923 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:39.910896 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c011865a41651b31b930def52edfe788-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-9.ec2.internal\" (UID: \"c011865a41651b31b930def52edfe788\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-9.ec2.internal" Apr 22 15:58:40.008259 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:40.008157 
2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-9.ec2.internal\" not found" Apr 22 15:58:40.063604 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:40.063573 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-9.ec2.internal" Apr 22 15:58:40.068144 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:40.068126 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-9.ec2.internal" Apr 22 15:58:40.108731 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:40.108701 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-9.ec2.internal\" not found" Apr 22 15:58:40.209160 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:40.209125 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-9.ec2.internal\" not found" Apr 22 15:58:40.309742 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:40.309654 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-9.ec2.internal\" not found" Apr 22 15:58:40.410141 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:40.410105 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-9.ec2.internal\" not found" Apr 22 15:58:40.414332 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:40.414315 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 15:58:40.414480 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:40.414464 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" 
Apr 22 15:58:40.414528 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:40.414489 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 15:58:40.494026 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:40.493970 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 15:53:39 +0000 UTC" deadline="2027-10-18 09:52:53.853464213 +0000 UTC" Apr 22 15:58:40.494026 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:40.494020 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13049h54m13.359448473s" Apr 22 15:58:40.509142 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:40.509111 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 15:58:40.510214 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:40.510176 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-9.ec2.internal\" not found" Apr 22 15:58:40.531620 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:40.531588 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 15:58:40.550967 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:40.550940 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-rp9pd" Apr 22 15:58:40.552612 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:40.552580 2572 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc011865a41651b31b930def52edfe788.slice/crio-7d2f357c1ab89b8f794609b7771b4e620c05fa65d280cbe8998cb4868c824d97 WatchSource:0}: Error finding container 7d2f357c1ab89b8f794609b7771b4e620c05fa65d280cbe8998cb4868c824d97: Status 404 returned error can't find the container with id 7d2f357c1ab89b8f794609b7771b4e620c05fa65d280cbe8998cb4868c824d97 Apr 22 15:58:40.552847 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:40.552822 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bba7b470f499a19673a3db16932ed93.slice/crio-bfd8ade5b0fe47ce339871f40813d657c123ef8fd3f2cb139b31554efe6d8d87 WatchSource:0}: Error finding container bfd8ade5b0fe47ce339871f40813d657c123ef8fd3f2cb139b31554efe6d8d87: Status 404 returned error can't find the container with id bfd8ade5b0fe47ce339871f40813d657c123ef8fd3f2cb139b31554efe6d8d87 Apr 22 15:58:40.557704 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:40.557686 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-rp9pd" Apr 22 15:58:40.558947 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:40.558934 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 15:58:40.610985 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:40.610948 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-9.ec2.internal\" not found" Apr 22 15:58:40.643210 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:40.643151 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-9.ec2.internal" event={"ID":"c011865a41651b31b930def52edfe788","Type":"ContainerStarted","Data":"7d2f357c1ab89b8f794609b7771b4e620c05fa65d280cbe8998cb4868c824d97"} Apr 22 15:58:40.644067 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:40.644038 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-9.ec2.internal" event={"ID":"4bba7b470f499a19673a3db16932ed93","Type":"ContainerStarted","Data":"bfd8ade5b0fe47ce339871f40813d657c123ef8fd3f2cb139b31554efe6d8d87"} Apr 22 15:58:40.711399 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:40.711372 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-9.ec2.internal\" not found" Apr 22 15:58:40.812014 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:40.811948 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-9.ec2.internal\" not found" Apr 22 15:58:40.823093 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:40.823063 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:58:40.912258 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:40.912221 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-9.ec2.internal\" not found" Apr 22 15:58:40.946721 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:40.946682 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:58:41.009161 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.008922 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-9.ec2.internal" Apr 22 15:58:41.020894 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.020767 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 15:58:41.021825 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.021803 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-9.ec2.internal" Apr 22 15:58:41.028858 
ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.028837 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 15:58:41.360451 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.360409 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:58:41.483140 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.483106 2572 apiserver.go:52] "Watching apiserver" Apr 22 15:58:41.489121 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.489098 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 15:58:41.489527 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.489501 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xxznf","kube-system/konnectivity-agent-nbrr9","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh","openshift-cluster-node-tuning-operator/tuned-ttmhq","openshift-dns/node-resolver-4sqfs","openshift-image-registry/node-ca-9d6jl","openshift-multus/network-metrics-daemon-76x4b","kube-system/kube-apiserver-proxy-ip-10-0-135-9.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-9.ec2.internal","openshift-multus/multus-additional-cni-plugins-4tvm4","openshift-multus/multus-cmd9q","openshift-network-diagnostics/network-check-target-dqwgt","openshift-network-operator/iptables-alerter-2q74x"] Apr 22 15:58:41.493414 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.493385 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-76x4b" Apr 22 15:58:41.493527 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:41.493472 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-76x4b" podUID="c090a1ee-5091-44d6-9e1b-65bf4dc8b1be" Apr 22 15:58:41.495386 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.495361 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nbrr9" Apr 22 15:58:41.495503 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.495462 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh" Apr 22 15:58:41.498289 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.498261 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 15:58:41.498718 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.498699 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 15:58:41.499035 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.499018 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8w28x\"" Apr 22 15:58:41.499366 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.499347 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 15:58:41.500422 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.500182 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 15:58:41.500422 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.500378 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-wnwkd\"" Apr 22 15:58:41.500977 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.500956 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 15:58:41.502395 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.502372 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4sqfs" Apr 22 15:58:41.502395 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.502386 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.502689 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.502671 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9d6jl" Apr 22 15:58:41.504761 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.504740 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.505211 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.505158 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-hfl9r\"" Apr 22 15:58:41.505364 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.505262 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 15:58:41.505364 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.505309 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 15:58:41.505494 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.505386 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-pmvxt\"" Apr 22 15:58:41.505494 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.505480 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 15:58:41.506179 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.505675 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 15:58:41.506179 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.505874 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 15:58:41.506179 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.505944 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 15:58:41.506179 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.505960 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-7szk9\"" Apr 22 15:58:41.506179 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.506045 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 15:58:41.507177 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.507161 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 15:58:41.507460 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.507438 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 15:58:41.508165 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.507883 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 15:58:41.508165 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.507961 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-d8m5g\"" Apr 22 15:58:41.508165 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.508008 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 15:58:41.508165 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.507963 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 15:58:41.508165 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.508010 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 15:58:41.508827 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.508807 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4tvm4" Apr 22 15:58:41.508932 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.508888 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.511112 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.511073 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 15:58:41.511276 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.511176 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 15:58:41.511276 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.511222 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dqwgt" Apr 22 15:58:41.511392 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:41.511290 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dqwgt" podUID="462dda43-d18b-4f55-b5d0-d9b9cbbb2e60" Apr 22 15:58:41.511495 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.511477 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 15:58:41.511554 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.511504 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 15:58:41.511604 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.511570 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 15:58:41.511745 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.511717 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 15:58:41.511821 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.511806 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mfpfn\"" Apr 22 15:58:41.511939 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.511926 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-pqx4x\"" Apr 22 15:58:41.513448 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.513432 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-2q74x" Apr 22 15:58:41.516125 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.515712 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-4cp5h\"" Apr 22 15:58:41.516125 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.515723 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 15:58:41.516125 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.515826 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 15:58:41.516125 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.515956 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 15:58:41.518918 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.518895 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzh9d\" (UniqueName: \"kubernetes.io/projected/a60066e5-252b-4865-879a-0d0d3a6618d4-kube-api-access-lzh9d\") pod \"node-resolver-4sqfs\" (UID: \"a60066e5-252b-4865-879a-0d0d3a6618d4\") " pod="openshift-dns/node-resolver-4sqfs" Apr 22 15:58:41.519022 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.518930 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/757bc440-2a2c-42f8-8e5d-03be90e55484-cni-binary-copy\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.519022 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.518953 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/5756e223-5da3-420b-a640-5e3cdce35004-serviceca\") pod \"node-ca-9d6jl\" (UID: \"5756e223-5da3-420b-a640-5e3cdce35004\") " pod="openshift-image-registry/node-ca-9d6jl" Apr 22 15:58:41.519022 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.518975 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fd3073fe-435c-4974-821b-9229018bf5f4-env-overrides\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.519022 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519001 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a1b11795-9e34-41fd-9198-cc57fa3cfbf7-cnibin\") pod \"multus-additional-cni-plugins-4tvm4\" (UID: \"a1b11795-9e34-41fd-9198-cc57fa3cfbf7\") " pod="openshift-multus/multus-additional-cni-plugins-4tvm4" Apr 22 15:58:41.519236 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519026 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a1b11795-9e34-41fd-9198-cc57fa3cfbf7-os-release\") pod \"multus-additional-cni-plugins-4tvm4\" (UID: \"a1b11795-9e34-41fd-9198-cc57fa3cfbf7\") " pod="openshift-multus/multus-additional-cni-plugins-4tvm4" Apr 22 15:58:41.519236 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519054 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-systemd-units\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.519236 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519077 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-host-slash\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.519236 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519100 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-multus-cni-dir\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.519236 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519123 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-hostroot\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.519236 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519148 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-etc-kubernetes\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.519236 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519171 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-var-lib-openvswitch\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.519236 ip-10-0-135-9 
kubenswrapper[2572]: I0422 15:58:41.519215 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-host-run-ovn-kubernetes\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.519634 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519243 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7cj9\" (UniqueName: \"kubernetes.io/projected/a1b11795-9e34-41fd-9198-cc57fa3cfbf7-kube-api-access-f7cj9\") pod \"multus-additional-cni-plugins-4tvm4\" (UID: \"a1b11795-9e34-41fd-9198-cc57fa3cfbf7\") " pod="openshift-multus/multus-additional-cni-plugins-4tvm4" Apr 22 15:58:41.519634 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519266 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a60066e5-252b-4865-879a-0d0d3a6618d4-hosts-file\") pod \"node-resolver-4sqfs\" (UID: \"a60066e5-252b-4865-879a-0d0d3a6618d4\") " pod="openshift-dns/node-resolver-4sqfs" Apr 22 15:58:41.519634 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519316 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0b05e334-2590-45f8-bdfc-e5f6d56bfea3-registration-dir\") pod \"aws-ebs-csi-driver-node-q76xh\" (UID: \"0b05e334-2590-45f8-bdfc-e5f6d56bfea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh" Apr 22 15:58:41.519634 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519351 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c56fk\" (UniqueName: 
\"kubernetes.io/projected/8071611a-9b57-488d-9246-ad02e7c43ccb-kube-api-access-c56fk\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.519634 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519380 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-host-run-netns\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.519634 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519403 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-run-openvswitch\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.519634 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519428 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs\") pod \"network-metrics-daemon-76x4b\" (UID: \"c090a1ee-5091-44d6-9e1b-65bf4dc8b1be\") " pod="openshift-multus/network-metrics-daemon-76x4b" Apr 22 15:58:41.519634 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519466 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-host\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.519634 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519505 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a1b11795-9e34-41fd-9198-cc57fa3cfbf7-system-cni-dir\") pod \"multus-additional-cni-plugins-4tvm4\" (UID: \"a1b11795-9e34-41fd-9198-cc57fa3cfbf7\") " pod="openshift-multus/multus-additional-cni-plugins-4tvm4" Apr 22 15:58:41.519634 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519530 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-system-cni-dir\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.519634 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519570 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-multus-socket-dir-parent\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.519634 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519615 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-multus-conf-dir\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.520125 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519655 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0b05e334-2590-45f8-bdfc-e5f6d56bfea3-socket-dir\") pod \"aws-ebs-csi-driver-node-q76xh\" (UID: \"0b05e334-2590-45f8-bdfc-e5f6d56bfea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh" Apr 
22 15:58:41.520125 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519682 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a1b11795-9e34-41fd-9198-cc57fa3cfbf7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4tvm4\" (UID: \"a1b11795-9e34-41fd-9198-cc57fa3cfbf7\") " pod="openshift-multus/multus-additional-cni-plugins-4tvm4" Apr 22 15:58:41.520125 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519708 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a60066e5-252b-4865-879a-0d0d3a6618d4-tmp-dir\") pod \"node-resolver-4sqfs\" (UID: \"a60066e5-252b-4865-879a-0d0d3a6618d4\") " pod="openshift-dns/node-resolver-4sqfs" Apr 22 15:58:41.520125 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519738 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0b05e334-2590-45f8-bdfc-e5f6d56bfea3-device-dir\") pod \"aws-ebs-csi-driver-node-q76xh\" (UID: \"0b05e334-2590-45f8-bdfc-e5f6d56bfea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh" Apr 22 15:58:41.520125 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519769 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0b05e334-2590-45f8-bdfc-e5f6d56bfea3-etc-selinux\") pod \"aws-ebs-csi-driver-node-q76xh\" (UID: \"0b05e334-2590-45f8-bdfc-e5f6d56bfea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh" Apr 22 15:58:41.520125 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519791 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/0b05e334-2590-45f8-bdfc-e5f6d56bfea3-sys-fs\") pod \"aws-ebs-csi-driver-node-q76xh\" (UID: \"0b05e334-2590-45f8-bdfc-e5f6d56bfea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh" Apr 22 15:58:41.520125 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519846 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-etc-kubernetes\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.520125 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519882 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-lib-modules\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.520125 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519916 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-var-lib-kubelet\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.520125 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.519947 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a1b11795-9e34-41fd-9198-cc57fa3cfbf7-cni-binary-copy\") pod \"multus-additional-cni-plugins-4tvm4\" (UID: \"a1b11795-9e34-41fd-9198-cc57fa3cfbf7\") " pod="openshift-multus/multus-additional-cni-plugins-4tvm4" Apr 22 15:58:41.520125 ip-10-0-135-9 kubenswrapper[2572]: I0422 
15:58:41.519984 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjkfx\" (UniqueName: \"kubernetes.io/projected/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-kube-api-access-sjkfx\") pod \"network-metrics-daemon-76x4b\" (UID: \"c090a1ee-5091-44d6-9e1b-65bf4dc8b1be\") " pod="openshift-multus/network-metrics-daemon-76x4b"
Apr 22 15:58:41.520125 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520018 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-os-release\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q"
Apr 22 15:58:41.520125 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520051 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-host-var-lib-cni-multus\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q"
Apr 22 15:58:41.520125 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520075 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-host-run-multus-certs\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q"
Apr 22 15:58:41.520125 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520091 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7m8g\" (UniqueName: \"kubernetes.io/projected/462dda43-d18b-4f55-b5d0-d9b9cbbb2e60-kube-api-access-x7m8g\") pod \"network-check-target-dqwgt\" (UID: \"462dda43-d18b-4f55-b5d0-d9b9cbbb2e60\") " pod="openshift-network-diagnostics/network-check-target-dqwgt"
Apr 22 15:58:41.520125 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520104 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-run-ovn\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.520751 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520119 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a1b11795-9e34-41fd-9198-cc57fa3cfbf7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4tvm4\" (UID: \"a1b11795-9e34-41fd-9198-cc57fa3cfbf7\") " pod="openshift-multus/multus-additional-cni-plugins-4tvm4"
Apr 22 15:58:41.520751 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520162 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-host-var-lib-kubelet\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q"
Apr 22 15:58:41.520751 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520213 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-etc-systemd\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq"
Apr 22 15:58:41.520751 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520242 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8071611a-9b57-488d-9246-ad02e7c43ccb-etc-tuned\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq"
Apr 22 15:58:41.520751 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520265 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fd3073fe-435c-4974-821b-9229018bf5f4-ovnkube-config\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.520751 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520304 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fd3073fe-435c-4974-821b-9229018bf5f4-ovn-node-metrics-cert\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.520751 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520332 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fd3073fe-435c-4974-821b-9229018bf5f4-ovnkube-script-lib\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.520751 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520356 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.520751 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520380 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-host-var-lib-cni-bin\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q"
Apr 22 15:58:41.520751 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520407 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b05e334-2590-45f8-bdfc-e5f6d56bfea3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-q76xh\" (UID: \"0b05e334-2590-45f8-bdfc-e5f6d56bfea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh"
Apr 22 15:58:41.520751 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520433 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-etc-sysconfig\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq"
Apr 22 15:58:41.520751 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520459 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-host-cni-bin\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.520751 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520483 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-cnibin\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q"
Apr 22 15:58:41.520751 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520511 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-host-run-netns\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q"
Apr 22 15:58:41.520751 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520546 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9ebd4e04-7111-4378-9b6d-f2d25a0e4642-agent-certs\") pod \"konnectivity-agent-nbrr9\" (UID: \"9ebd4e04-7111-4378-9b6d-f2d25a0e4642\") " pod="kube-system/konnectivity-agent-nbrr9"
Apr 22 15:58:41.520751 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520562 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-etc-openvswitch\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.521266 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520576 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/757bc440-2a2c-42f8-8e5d-03be90e55484-multus-daemon-config\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q"
Apr 22 15:58:41.521266 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520590 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-758c8\" (UniqueName: \"kubernetes.io/projected/757bc440-2a2c-42f8-8e5d-03be90e55484-kube-api-access-758c8\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q"
Apr 22 15:58:41.521266 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520609 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9ebd4e04-7111-4378-9b6d-f2d25a0e4642-konnectivity-ca\") pod \"konnectivity-agent-nbrr9\" (UID: \"9ebd4e04-7111-4378-9b6d-f2d25a0e4642\") " pod="kube-system/konnectivity-agent-nbrr9"
Apr 22 15:58:41.521266 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520627 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-sys\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq"
Apr 22 15:58:41.521266 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520642 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgj2h\" (UniqueName: \"kubernetes.io/projected/5756e223-5da3-420b-a640-5e3cdce35004-kube-api-access-qgj2h\") pod \"node-ca-9d6jl\" (UID: \"5756e223-5da3-420b-a640-5e3cdce35004\") " pod="openshift-image-registry/node-ca-9d6jl"
Apr 22 15:58:41.521266 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520657 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-host-kubelet\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.521266 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520671 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-run-systemd\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.521266 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520691 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-log-socket\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.521266 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520707 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-host-cni-netd\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.521266 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520721 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv6z4\" (UniqueName: \"kubernetes.io/projected/fd3073fe-435c-4974-821b-9229018bf5f4-kube-api-access-kv6z4\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.521266 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520735 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4q84\" (UniqueName: \"kubernetes.io/projected/0b05e334-2590-45f8-bdfc-e5f6d56bfea3-kube-api-access-c4q84\") pod \"aws-ebs-csi-driver-node-q76xh\" (UID: \"0b05e334-2590-45f8-bdfc-e5f6d56bfea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh"
Apr 22 15:58:41.521266 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520755 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-etc-modprobe-d\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq"
Apr 22 15:58:41.521266 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520778 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-etc-sysctl-conf\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq"
Apr 22 15:58:41.521266 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520801 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a1b11795-9e34-41fd-9198-cc57fa3cfbf7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4tvm4\" (UID: \"a1b11795-9e34-41fd-9198-cc57fa3cfbf7\") " pod="openshift-multus/multus-additional-cni-plugins-4tvm4"
Apr 22 15:58:41.521266 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520838 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-node-log\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.521266 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520864 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-host-run-k8s-cni-cncf-io\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q"
Apr 22 15:58:41.521886 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520897 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-etc-sysctl-d\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq"
Apr 22 15:58:41.521886 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520931 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-run\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq"
Apr 22 15:58:41.521886 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520954 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8071611a-9b57-488d-9246-ad02e7c43ccb-tmp\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq"
Apr 22 15:58:41.521886 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.520978 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5756e223-5da3-420b-a640-5e3cdce35004-host\") pod \"node-ca-9d6jl\" (UID: \"5756e223-5da3-420b-a640-5e3cdce35004\") " pod="openshift-image-registry/node-ca-9d6jl"
Apr 22 15:58:41.558929 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.558896 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 15:53:40 +0000 UTC" deadline="2027-10-22 14:40:01.419797298 +0000 UTC"
Apr 22 15:58:41.558929 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.558927 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13150h41m19.860873555s"
Apr 22 15:58:41.609728 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.609695 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 15:58:41.621263 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621144 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fd3073fe-435c-4974-821b-9229018bf5f4-env-overrides\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.621263 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621181 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a1b11795-9e34-41fd-9198-cc57fa3cfbf7-cnibin\") pod \"multus-additional-cni-plugins-4tvm4\" (UID: \"a1b11795-9e34-41fd-9198-cc57fa3cfbf7\") " pod="openshift-multus/multus-additional-cni-plugins-4tvm4"
Apr 22 15:58:41.621263 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621227 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a1b11795-9e34-41fd-9198-cc57fa3cfbf7-os-release\") pod \"multus-additional-cni-plugins-4tvm4\" (UID: \"a1b11795-9e34-41fd-9198-cc57fa3cfbf7\") " pod="openshift-multus/multus-additional-cni-plugins-4tvm4"
Apr 22 15:58:41.621517 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621273 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-systemd-units\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.621517 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621312 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a1b11795-9e34-41fd-9198-cc57fa3cfbf7-os-release\") pod \"multus-additional-cni-plugins-4tvm4\" (UID: \"a1b11795-9e34-41fd-9198-cc57fa3cfbf7\") " pod="openshift-multus/multus-additional-cni-plugins-4tvm4"
Apr 22 15:58:41.621517 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621340 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-host-slash\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.621517 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621356 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a1b11795-9e34-41fd-9198-cc57fa3cfbf7-cnibin\") pod \"multus-additional-cni-plugins-4tvm4\" (UID: \"a1b11795-9e34-41fd-9198-cc57fa3cfbf7\") " pod="openshift-multus/multus-additional-cni-plugins-4tvm4"
Apr 22 15:58:41.621517 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621366 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-systemd-units\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.621517 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621364 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-multus-cni-dir\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q"
Apr 22 15:58:41.621517 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621403 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-host-slash\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.621517 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621413 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-hostroot\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q"
Apr 22 15:58:41.621517 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621449 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-etc-kubernetes\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q"
Apr 22 15:58:41.621517 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621479 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-var-lib-openvswitch\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.621517 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621478 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-hostroot\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q"
Apr 22 15:58:41.621517 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621495 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-etc-kubernetes\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q"
Apr 22 15:58:41.621517 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621501 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-host-run-ovn-kubernetes\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.621517 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621503 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-multus-cni-dir\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q"
Apr 22 15:58:41.622065 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621526 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-var-lib-openvswitch\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.622065 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621534 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-host-run-ovn-kubernetes\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.622065 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621539 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7cj9\" (UniqueName: \"kubernetes.io/projected/a1b11795-9e34-41fd-9198-cc57fa3cfbf7-kube-api-access-f7cj9\") pod \"multus-additional-cni-plugins-4tvm4\" (UID: \"a1b11795-9e34-41fd-9198-cc57fa3cfbf7\") " pod="openshift-multus/multus-additional-cni-plugins-4tvm4"
Apr 22 15:58:41.622065 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621568 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a60066e5-252b-4865-879a-0d0d3a6618d4-hosts-file\") pod \"node-resolver-4sqfs\" (UID: \"a60066e5-252b-4865-879a-0d0d3a6618d4\") " pod="openshift-dns/node-resolver-4sqfs"
Apr 22 15:58:41.622065 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621593 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0b05e334-2590-45f8-bdfc-e5f6d56bfea3-registration-dir\") pod \"aws-ebs-csi-driver-node-q76xh\" (UID: \"0b05e334-2590-45f8-bdfc-e5f6d56bfea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh"
Apr 22 15:58:41.622065 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621641 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c56fk\" (UniqueName: \"kubernetes.io/projected/8071611a-9b57-488d-9246-ad02e7c43ccb-kube-api-access-c56fk\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq"
Apr 22 15:58:41.622065 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621643 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a60066e5-252b-4865-879a-0d0d3a6618d4-hosts-file\") pod \"node-resolver-4sqfs\" (UID: \"a60066e5-252b-4865-879a-0d0d3a6618d4\") " pod="openshift-dns/node-resolver-4sqfs"
Apr 22 15:58:41.622065 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621666 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-host-run-netns\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.622065 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621712 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0b05e334-2590-45f8-bdfc-e5f6d56bfea3-registration-dir\") pod \"aws-ebs-csi-driver-node-q76xh\" (UID: \"0b05e334-2590-45f8-bdfc-e5f6d56bfea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh"
Apr 22 15:58:41.622065 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621715 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-host-run-netns\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.622065 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621814 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fd3073fe-435c-4974-821b-9229018bf5f4-env-overrides\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.622065 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621854 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-run-openvswitch\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.622065 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621820 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-run-openvswitch\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:58:41.622065 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621894 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs\") pod \"network-metrics-daemon-76x4b\" (UID: \"c090a1ee-5091-44d6-9e1b-65bf4dc8b1be\") " pod="openshift-multus/network-metrics-daemon-76x4b"
Apr 22 15:58:41.622065 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621917 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-host\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq"
Apr 22 15:58:41.622065 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621934 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a1b11795-9e34-41fd-9198-cc57fa3cfbf7-system-cni-dir\") pod \"multus-additional-cni-plugins-4tvm4\" (UID: \"a1b11795-9e34-41fd-9198-cc57fa3cfbf7\") " pod="openshift-multus/multus-additional-cni-plugins-4tvm4"
Apr 22 15:58:41.622065 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621951 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-system-cni-dir\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q"
Apr 22 15:58:41.622815 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621996 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-host\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq"
Apr 22 15:58:41.622815 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622001 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-system-cni-dir\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q"
Apr 22 15:58:41.622815 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.621991 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a1b11795-9e34-41fd-9198-cc57fa3cfbf7-system-cni-dir\") pod \"multus-additional-cni-plugins-4tvm4\" (UID: \"a1b11795-9e34-41fd-9198-cc57fa3cfbf7\") " pod="openshift-multus/multus-additional-cni-plugins-4tvm4"
Apr 22 15:58:41.622815 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622013 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-multus-socket-dir-parent\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q"
Apr 22 15:58:41.622815 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622042 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-multus-conf-dir\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q"
Apr 22 15:58:41.622815 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:41.622057 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:58:41.622815 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622043 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-multus-socket-dir-parent\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q"
Apr 22 15:58:41.622815 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622059 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0b05e334-2590-45f8-bdfc-e5f6d56bfea3-socket-dir\") pod \"aws-ebs-csi-driver-node-q76xh\" (UID: \"0b05e334-2590-45f8-bdfc-e5f6d56bfea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh"
Apr 22 15:58:41.622815 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622108 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-multus-conf-dir\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q"
Apr 22 15:58:41.622815 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:41.622132 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs podName:c090a1ee-5091-44d6-9e1b-65bf4dc8b1be nodeName:}" failed. No retries permitted until 2026-04-22 15:58:42.122117128 +0000 UTC m=+3.021671177 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs") pod "network-metrics-daemon-76x4b" (UID: "c090a1ee-5091-44d6-9e1b-65bf4dc8b1be") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:58:41.622815 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622134 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0b05e334-2590-45f8-bdfc-e5f6d56bfea3-socket-dir\") pod \"aws-ebs-csi-driver-node-q76xh\" (UID: \"0b05e334-2590-45f8-bdfc-e5f6d56bfea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh"
Apr 22 15:58:41.622815 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622173 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a1b11795-9e34-41fd-9198-cc57fa3cfbf7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4tvm4\" (UID: \"a1b11795-9e34-41fd-9198-cc57fa3cfbf7\") " pod="openshift-multus/multus-additional-cni-plugins-4tvm4"
Apr 22 15:58:41.622815 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622215 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a60066e5-252b-4865-879a-0d0d3a6618d4-tmp-dir\") pod \"node-resolver-4sqfs\" (UID: \"a60066e5-252b-4865-879a-0d0d3a6618d4\") " pod="openshift-dns/node-resolver-4sqfs"
Apr 22 15:58:41.622815 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622239 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0b05e334-2590-45f8-bdfc-e5f6d56bfea3-device-dir\") pod \"aws-ebs-csi-driver-node-q76xh\" (UID: \"0b05e334-2590-45f8-bdfc-e5f6d56bfea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh"
Apr 22 15:58:41.622815 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622257 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0b05e334-2590-45f8-bdfc-e5f6d56bfea3-etc-selinux\") pod \"aws-ebs-csi-driver-node-q76xh\" (UID: \"0b05e334-2590-45f8-bdfc-e5f6d56bfea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh"
Apr 22 15:58:41.622815 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622281 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0b05e334-2590-45f8-bdfc-e5f6d56bfea3-sys-fs\") pod \"aws-ebs-csi-driver-node-q76xh\" (UID: \"0b05e334-2590-45f8-bdfc-e5f6d56bfea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh"
Apr 22 15:58:41.622815 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622305 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0b05e334-2590-45f8-bdfc-e5f6d56bfea3-device-dir\") pod \"aws-ebs-csi-driver-node-q76xh\" (UID: \"0b05e334-2590-45f8-bdfc-e5f6d56bfea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh"
Apr 22 15:58:41.623596 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622344 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-etc-kubernetes\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq"
Apr 22 15:58:41.623596 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622357 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0b05e334-2590-45f8-bdfc-e5f6d56bfea3-sys-fs\") pod \"aws-ebs-csi-driver-node-q76xh\" (UID: \"0b05e334-2590-45f8-bdfc-e5f6d56bfea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh"
Apr 22 15:58:41.623596 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622375 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-lib-modules\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq"
Apr 22 15:58:41.623596 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622388 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0b05e334-2590-45f8-bdfc-e5f6d56bfea3-etc-selinux\") pod \"aws-ebs-csi-driver-node-q76xh\" (UID: \"0b05e334-2590-45f8-bdfc-e5f6d56bfea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh"
Apr 22 15:58:41.623596 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622390 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-etc-kubernetes\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq"
Apr 22 15:58:41.623596 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622400 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-var-lib-kubelet\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq"
Apr 22 15:58:41.623596 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622446 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-var-lib-kubelet\") pod \"tuned-ttmhq\" (UID:
\"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.623596 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622453 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a1b11795-9e34-41fd-9198-cc57fa3cfbf7-cni-binary-copy\") pod \"multus-additional-cni-plugins-4tvm4\" (UID: \"a1b11795-9e34-41fd-9198-cc57fa3cfbf7\") " pod="openshift-multus/multus-additional-cni-plugins-4tvm4" Apr 22 15:58:41.623596 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622481 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjkfx\" (UniqueName: \"kubernetes.io/projected/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-kube-api-access-sjkfx\") pod \"network-metrics-daemon-76x4b\" (UID: \"c090a1ee-5091-44d6-9e1b-65bf4dc8b1be\") " pod="openshift-multus/network-metrics-daemon-76x4b" Apr 22 15:58:41.623596 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622498 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-lib-modules\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.623596 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622507 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-os-release\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.623596 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622539 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a60066e5-252b-4865-879a-0d0d3a6618d4-tmp-dir\") pod \"node-resolver-4sqfs\" 
(UID: \"a60066e5-252b-4865-879a-0d0d3a6618d4\") " pod="openshift-dns/node-resolver-4sqfs" Apr 22 15:58:41.623596 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622552 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-host-var-lib-cni-multus\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.623596 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622578 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-host-run-multus-certs\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.623596 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622599 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-host-var-lib-cni-multus\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.623596 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622603 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7m8g\" (UniqueName: \"kubernetes.io/projected/462dda43-d18b-4f55-b5d0-d9b9cbbb2e60-kube-api-access-x7m8g\") pod \"network-check-target-dqwgt\" (UID: \"462dda43-d18b-4f55-b5d0-d9b9cbbb2e60\") " pod="openshift-network-diagnostics/network-check-target-dqwgt" Apr 22 15:58:41.623596 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622632 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-run-ovn\") pod \"ovnkube-node-xxznf\" 
(UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.624211 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622650 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-os-release\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.624211 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622653 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-host-run-multus-certs\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.624211 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622658 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a1b11795-9e34-41fd-9198-cc57fa3cfbf7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4tvm4\" (UID: \"a1b11795-9e34-41fd-9198-cc57fa3cfbf7\") " pod="openshift-multus/multus-additional-cni-plugins-4tvm4" Apr 22 15:58:41.624211 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622696 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-host-var-lib-kubelet\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.624211 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622700 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-run-ovn\") pod \"ovnkube-node-xxznf\" (UID: 
\"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.624211 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622729 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-etc-systemd\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.624211 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622733 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-host-var-lib-kubelet\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.624211 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622779 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-etc-systemd\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.624211 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622785 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a1b11795-9e34-41fd-9198-cc57fa3cfbf7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4tvm4\" (UID: \"a1b11795-9e34-41fd-9198-cc57fa3cfbf7\") " pod="openshift-multus/multus-additional-cni-plugins-4tvm4" Apr 22 15:58:41.624211 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622799 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8071611a-9b57-488d-9246-ad02e7c43ccb-etc-tuned\") pod \"tuned-ttmhq\" 
(UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.624211 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622838 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fd3073fe-435c-4974-821b-9229018bf5f4-ovnkube-config\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.624211 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622898 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fd3073fe-435c-4974-821b-9229018bf5f4-ovn-node-metrics-cert\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.624211 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622929 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fd3073fe-435c-4974-821b-9229018bf5f4-ovnkube-script-lib\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.624211 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622954 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.624211 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.622984 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/a1b11795-9e34-41fd-9198-cc57fa3cfbf7-cni-binary-copy\") pod \"multus-additional-cni-plugins-4tvm4\" (UID: \"a1b11795-9e34-41fd-9198-cc57fa3cfbf7\") " pod="openshift-multus/multus-additional-cni-plugins-4tvm4" Apr 22 15:58:41.624211 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623007 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-host-var-lib-cni-bin\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.624211 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623033 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b05e334-2590-45f8-bdfc-e5f6d56bfea3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-q76xh\" (UID: \"0b05e334-2590-45f8-bdfc-e5f6d56bfea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh" Apr 22 15:58:41.625246 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623057 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-etc-sysconfig\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.625246 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623083 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-host-cni-bin\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.625246 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623109 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-cnibin\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.625246 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623115 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 15:58:41.625246 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623132 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-host-run-netns\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.625246 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623169 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-etc-sysconfig\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.625246 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623169 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9ebd4e04-7111-4378-9b6d-f2d25a0e4642-agent-certs\") pod \"konnectivity-agent-nbrr9\" (UID: \"9ebd4e04-7111-4378-9b6d-f2d25a0e4642\") " pod="kube-system/konnectivity-agent-nbrr9" Apr 22 15:58:41.625246 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623230 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f4e23e3e-8847-4777-b278-f9b1ef808fd3-iptables-alerter-script\") pod 
\"iptables-alerter-2q74x\" (UID: \"f4e23e3e-8847-4777-b278-f9b1ef808fd3\") " pod="openshift-network-operator/iptables-alerter-2q74x" Apr 22 15:58:41.625246 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623245 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-host-var-lib-cni-bin\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.625246 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623254 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-etc-openvswitch\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.625246 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623271 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/757bc440-2a2c-42f8-8e5d-03be90e55484-multus-daemon-config\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.625246 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623296 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-758c8\" (UniqueName: \"kubernetes.io/projected/757bc440-2a2c-42f8-8e5d-03be90e55484-kube-api-access-758c8\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.625246 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623302 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b05e334-2590-45f8-bdfc-e5f6d56bfea3-kubelet-dir\") pod 
\"aws-ebs-csi-driver-node-q76xh\" (UID: \"0b05e334-2590-45f8-bdfc-e5f6d56bfea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh" Apr 22 15:58:41.625246 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623318 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9ebd4e04-7111-4378-9b6d-f2d25a0e4642-konnectivity-ca\") pod \"konnectivity-agent-nbrr9\" (UID: \"9ebd4e04-7111-4378-9b6d-f2d25a0e4642\") " pod="kube-system/konnectivity-agent-nbrr9" Apr 22 15:58:41.625246 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623339 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-sys\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.625246 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623342 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-etc-openvswitch\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.625246 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623354 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgj2h\" (UniqueName: \"kubernetes.io/projected/5756e223-5da3-420b-a640-5e3cdce35004-kube-api-access-qgj2h\") pod \"node-ca-9d6jl\" (UID: \"5756e223-5da3-420b-a640-5e3cdce35004\") " pod="openshift-image-registry/node-ca-9d6jl" Apr 22 15:58:41.625246 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623372 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/f4e23e3e-8847-4777-b278-f9b1ef808fd3-host-slash\") pod \"iptables-alerter-2q74x\" (UID: \"f4e23e3e-8847-4777-b278-f9b1ef808fd3\") " pod="openshift-network-operator/iptables-alerter-2q74x" Apr 22 15:58:41.626076 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623388 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-host-kubelet\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.626076 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623408 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-run-systemd\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.626076 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623432 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-log-socket\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.626076 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623456 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-host-cni-netd\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.626076 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623482 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kv6z4\" (UniqueName: 
\"kubernetes.io/projected/fd3073fe-435c-4974-821b-9229018bf5f4-kube-api-access-kv6z4\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.626076 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623508 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c4q84\" (UniqueName: \"kubernetes.io/projected/0b05e334-2590-45f8-bdfc-e5f6d56bfea3-kube-api-access-c4q84\") pod \"aws-ebs-csi-driver-node-q76xh\" (UID: \"0b05e334-2590-45f8-bdfc-e5f6d56bfea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh" Apr 22 15:58:41.626076 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623533 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-etc-modprobe-d\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.626076 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623557 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-etc-sysctl-conf\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.626076 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623582 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a1b11795-9e34-41fd-9198-cc57fa3cfbf7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4tvm4\" (UID: \"a1b11795-9e34-41fd-9198-cc57fa3cfbf7\") " pod="openshift-multus/multus-additional-cni-plugins-4tvm4" Apr 22 15:58:41.626076 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623608 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-node-log\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.626076 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623635 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-host-run-k8s-cni-cncf-io\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.626076 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623660 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-etc-sysctl-d\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.626076 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623683 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-run\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.626076 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623715 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xtgg\" (UniqueName: \"kubernetes.io/projected/f4e23e3e-8847-4777-b278-f9b1ef808fd3-kube-api-access-2xtgg\") pod \"iptables-alerter-2q74x\" (UID: \"f4e23e3e-8847-4777-b278-f9b1ef808fd3\") " pod="openshift-network-operator/iptables-alerter-2q74x" Apr 22 15:58:41.626076 ip-10-0-135-9 kubenswrapper[2572]: I0422 
15:58:41.623742 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8071611a-9b57-488d-9246-ad02e7c43ccb-tmp\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.626076 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623766 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5756e223-5da3-420b-a640-5e3cdce35004-host\") pod \"node-ca-9d6jl\" (UID: \"5756e223-5da3-420b-a640-5e3cdce35004\") " pod="openshift-image-registry/node-ca-9d6jl" Apr 22 15:58:41.626076 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623779 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fd3073fe-435c-4974-821b-9229018bf5f4-ovnkube-config\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.626880 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623791 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzh9d\" (UniqueName: \"kubernetes.io/projected/a60066e5-252b-4865-879a-0d0d3a6618d4-kube-api-access-lzh9d\") pod \"node-resolver-4sqfs\" (UID: \"a60066e5-252b-4865-879a-0d0d3a6618d4\") " pod="openshift-dns/node-resolver-4sqfs" Apr 22 15:58:41.626880 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623822 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/757bc440-2a2c-42f8-8e5d-03be90e55484-cni-binary-copy\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.626880 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623846 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5756e223-5da3-420b-a640-5e3cdce35004-serviceca\") pod \"node-ca-9d6jl\" (UID: \"5756e223-5da3-420b-a640-5e3cdce35004\") " pod="openshift-image-registry/node-ca-9d6jl" Apr 22 15:58:41.626880 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623131 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a1b11795-9e34-41fd-9198-cc57fa3cfbf7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4tvm4\" (UID: \"a1b11795-9e34-41fd-9198-cc57fa3cfbf7\") " pod="openshift-multus/multus-additional-cni-plugins-4tvm4" Apr 22 15:58:41.626880 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.623055 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.626880 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.624094 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-sys\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.626880 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.624114 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-run-systemd\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.626880 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.624120 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-etc-modprobe-d\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.626880 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.624152 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-log-socket\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.626880 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.624270 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5756e223-5da3-420b-a640-5e3cdce35004-serviceca\") pod \"node-ca-9d6jl\" (UID: \"5756e223-5da3-420b-a640-5e3cdce35004\") " pod="openshift-image-registry/node-ca-9d6jl" Apr 22 15:58:41.626880 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.624341 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-host-kubelet\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.626880 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.624375 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-host-cni-netd\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.626880 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.624496 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/757bc440-2a2c-42f8-8e5d-03be90e55484-multus-daemon-config\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.626880 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.624568 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-host-run-k8s-cni-cncf-io\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.626880 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.624643 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9ebd4e04-7111-4378-9b6d-f2d25a0e4642-konnectivity-ca\") pod \"konnectivity-agent-nbrr9\" (UID: \"9ebd4e04-7111-4378-9b6d-f2d25a0e4642\") " pod="kube-system/konnectivity-agent-nbrr9" Apr 22 15:58:41.626880 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.624677 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a1b11795-9e34-41fd-9198-cc57fa3cfbf7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4tvm4\" (UID: \"a1b11795-9e34-41fd-9198-cc57fa3cfbf7\") " pod="openshift-multus/multus-additional-cni-plugins-4tvm4" Apr 22 15:58:41.626880 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.624693 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-run\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.626880 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.624647 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-etc-sysctl-conf\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.627524 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.624718 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-node-log\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.627524 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.624751 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8071611a-9b57-488d-9246-ad02e7c43ccb-etc-sysctl-d\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.627524 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.624785 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5756e223-5da3-420b-a640-5e3cdce35004-host\") pod \"node-ca-9d6jl\" (UID: \"5756e223-5da3-420b-a640-5e3cdce35004\") " pod="openshift-image-registry/node-ca-9d6jl" Apr 22 15:58:41.627524 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.624830 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd3073fe-435c-4974-821b-9229018bf5f4-host-cni-bin\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.627524 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.624855 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-cnibin\") pod 
\"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.627524 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.624908 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/757bc440-2a2c-42f8-8e5d-03be90e55484-host-run-netns\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.627524 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.625287 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fd3073fe-435c-4974-821b-9229018bf5f4-ovnkube-script-lib\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.627524 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.625289 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/757bc440-2a2c-42f8-8e5d-03be90e55484-cni-binary-copy\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.627524 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.627083 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9ebd4e04-7111-4378-9b6d-f2d25a0e4642-agent-certs\") pod \"konnectivity-agent-nbrr9\" (UID: \"9ebd4e04-7111-4378-9b6d-f2d25a0e4642\") " pod="kube-system/konnectivity-agent-nbrr9" Apr 22 15:58:41.627524 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.627353 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fd3073fe-435c-4974-821b-9229018bf5f4-ovn-node-metrics-cert\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.627524 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.627391 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8071611a-9b57-488d-9246-ad02e7c43ccb-tmp\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.627524 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.627423 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8071611a-9b57-488d-9246-ad02e7c43ccb-etc-tuned\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.629741 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:41.629451 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:58:41.629741 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:41.629473 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:58:41.629741 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:41.629486 2572 projected.go:194] Error preparing data for projected volume kube-api-access-x7m8g for pod openshift-network-diagnostics/network-check-target-dqwgt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:41.629741 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:41.629558 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/462dda43-d18b-4f55-b5d0-d9b9cbbb2e60-kube-api-access-x7m8g podName:462dda43-d18b-4f55-b5d0-d9b9cbbb2e60 nodeName:}" failed. 
No retries permitted until 2026-04-22 15:58:42.129538806 +0000 UTC m=+3.029092856 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-x7m8g" (UniqueName: "kubernetes.io/projected/462dda43-d18b-4f55-b5d0-d9b9cbbb2e60-kube-api-access-x7m8g") pod "network-check-target-dqwgt" (UID: "462dda43-d18b-4f55-b5d0-d9b9cbbb2e60") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:41.630938 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.630897 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7cj9\" (UniqueName: \"kubernetes.io/projected/a1b11795-9e34-41fd-9198-cc57fa3cfbf7-kube-api-access-f7cj9\") pod \"multus-additional-cni-plugins-4tvm4\" (UID: \"a1b11795-9e34-41fd-9198-cc57fa3cfbf7\") " pod="openshift-multus/multus-additional-cni-plugins-4tvm4" Apr 22 15:58:41.631628 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.631602 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-758c8\" (UniqueName: \"kubernetes.io/projected/757bc440-2a2c-42f8-8e5d-03be90e55484-kube-api-access-758c8\") pod \"multus-cmd9q\" (UID: \"757bc440-2a2c-42f8-8e5d-03be90e55484\") " pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.632492 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.632028 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4q84\" (UniqueName: \"kubernetes.io/projected/0b05e334-2590-45f8-bdfc-e5f6d56bfea3-kube-api-access-c4q84\") pod \"aws-ebs-csi-driver-node-q76xh\" (UID: \"0b05e334-2590-45f8-bdfc-e5f6d56bfea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh" Apr 22 15:58:41.632492 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.632338 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjkfx\" (UniqueName: 
\"kubernetes.io/projected/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-kube-api-access-sjkfx\") pod \"network-metrics-daemon-76x4b\" (UID: \"c090a1ee-5091-44d6-9e1b-65bf4dc8b1be\") " pod="openshift-multus/network-metrics-daemon-76x4b" Apr 22 15:58:41.632492 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.632453 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c56fk\" (UniqueName: \"kubernetes.io/projected/8071611a-9b57-488d-9246-ad02e7c43ccb-kube-api-access-c56fk\") pod \"tuned-ttmhq\" (UID: \"8071611a-9b57-488d-9246-ad02e7c43ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.633621 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.633598 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzh9d\" (UniqueName: \"kubernetes.io/projected/a60066e5-252b-4865-879a-0d0d3a6618d4-kube-api-access-lzh9d\") pod \"node-resolver-4sqfs\" (UID: \"a60066e5-252b-4865-879a-0d0d3a6618d4\") " pod="openshift-dns/node-resolver-4sqfs" Apr 22 15:58:41.633716 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.633644 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv6z4\" (UniqueName: \"kubernetes.io/projected/fd3073fe-435c-4974-821b-9229018bf5f4-kube-api-access-kv6z4\") pod \"ovnkube-node-xxznf\" (UID: \"fd3073fe-435c-4974-821b-9229018bf5f4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.633868 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.633849 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgj2h\" (UniqueName: \"kubernetes.io/projected/5756e223-5da3-420b-a640-5e3cdce35004-kube-api-access-qgj2h\") pod \"node-ca-9d6jl\" (UID: \"5756e223-5da3-420b-a640-5e3cdce35004\") " pod="openshift-image-registry/node-ca-9d6jl" Apr 22 15:58:41.724959 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.724915 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2xtgg\" (UniqueName: \"kubernetes.io/projected/f4e23e3e-8847-4777-b278-f9b1ef808fd3-kube-api-access-2xtgg\") pod \"iptables-alerter-2q74x\" (UID: \"f4e23e3e-8847-4777-b278-f9b1ef808fd3\") " pod="openshift-network-operator/iptables-alerter-2q74x" Apr 22 15:58:41.725162 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.725032 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f4e23e3e-8847-4777-b278-f9b1ef808fd3-iptables-alerter-script\") pod \"iptables-alerter-2q74x\" (UID: \"f4e23e3e-8847-4777-b278-f9b1ef808fd3\") " pod="openshift-network-operator/iptables-alerter-2q74x" Apr 22 15:58:41.725162 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.725056 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f4e23e3e-8847-4777-b278-f9b1ef808fd3-host-slash\") pod \"iptables-alerter-2q74x\" (UID: \"f4e23e3e-8847-4777-b278-f9b1ef808fd3\") " pod="openshift-network-operator/iptables-alerter-2q74x" Apr 22 15:58:41.725162 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.725123 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f4e23e3e-8847-4777-b278-f9b1ef808fd3-host-slash\") pod \"iptables-alerter-2q74x\" (UID: \"f4e23e3e-8847-4777-b278-f9b1ef808fd3\") " pod="openshift-network-operator/iptables-alerter-2q74x" Apr 22 15:58:41.725683 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.725648 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f4e23e3e-8847-4777-b278-f9b1ef808fd3-iptables-alerter-script\") pod \"iptables-alerter-2q74x\" (UID: \"f4e23e3e-8847-4777-b278-f9b1ef808fd3\") " pod="openshift-network-operator/iptables-alerter-2q74x" Apr 22 15:58:41.733930 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.733902 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xtgg\" (UniqueName: \"kubernetes.io/projected/f4e23e3e-8847-4777-b278-f9b1ef808fd3-kube-api-access-2xtgg\") pod \"iptables-alerter-2q74x\" (UID: \"f4e23e3e-8847-4777-b278-f9b1ef808fd3\") " pod="openshift-network-operator/iptables-alerter-2q74x" Apr 22 15:58:41.810964 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.810925 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nbrr9" Apr 22 15:58:41.819768 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.819741 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9d6jl" Apr 22 15:58:41.827377 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.827356 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh" Apr 22 15:58:41.832978 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.832951 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4sqfs" Apr 22 15:58:41.838582 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.838559 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" Apr 22 15:58:41.845237 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.845214 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:58:41.850097 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.850077 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:58:41.851133 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.851116 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4tvm4" Apr 22 15:58:41.859172 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.859151 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cmd9q" Apr 22 15:58:41.864814 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:41.864789 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2q74x" Apr 22 15:58:42.128061 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:42.128021 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs\") pod \"network-metrics-daemon-76x4b\" (UID: \"c090a1ee-5091-44d6-9e1b-65bf4dc8b1be\") " pod="openshift-multus/network-metrics-daemon-76x4b" Apr 22 15:58:42.128278 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:42.128155 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:42.128278 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:42.128248 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs podName:c090a1ee-5091-44d6-9e1b-65bf4dc8b1be nodeName:}" failed. No retries permitted until 2026-04-22 15:58:43.128229094 +0000 UTC m=+4.027783157 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs") pod "network-metrics-daemon-76x4b" (UID: "c090a1ee-5091-44d6-9e1b-65bf4dc8b1be") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:42.205708 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:42.205671 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8071611a_9b57_488d_9246_ad02e7c43ccb.slice/crio-ece6feaff2c505d5944b76ebbe11c030d49b3e4d6f1a812d5ce9c6af141e58f0 WatchSource:0}: Error finding container ece6feaff2c505d5944b76ebbe11c030d49b3e4d6f1a812d5ce9c6af141e58f0: Status 404 returned error can't find the container with id ece6feaff2c505d5944b76ebbe11c030d49b3e4d6f1a812d5ce9c6af141e58f0 Apr 22 15:58:42.207661 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:42.207565 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b05e334_2590_45f8_bdfc_e5f6d56bfea3.slice/crio-d0d0b6863e861a6a9d4f62e069376212f4a00f77f0fd217e8a34f0f3dfd0a458 WatchSource:0}: Error finding container d0d0b6863e861a6a9d4f62e069376212f4a00f77f0fd217e8a34f0f3dfd0a458: Status 404 returned error can't find the container with id d0d0b6863e861a6a9d4f62e069376212f4a00f77f0fd217e8a34f0f3dfd0a458 Apr 22 15:58:42.212550 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:42.212480 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4e23e3e_8847_4777_b278_f9b1ef808fd3.slice/crio-e51a01934f0b79c1f75d18b5b0fb6aab83ae55883ee488a5ee2812ca365305ad WatchSource:0}: Error finding container e51a01934f0b79c1f75d18b5b0fb6aab83ae55883ee488a5ee2812ca365305ad: Status 404 returned error can't find the container with id e51a01934f0b79c1f75d18b5b0fb6aab83ae55883ee488a5ee2812ca365305ad Apr 22 15:58:42.213949 ip-10-0-135-9 
kubenswrapper[2572]: W0422 15:58:42.213922 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda60066e5_252b_4865_879a_0d0d3a6618d4.slice/crio-d14ab964eab0742988454b2c3699a7d655d1da7efc074684c8fbdc87e874b671 WatchSource:0}: Error finding container d14ab964eab0742988454b2c3699a7d655d1da7efc074684c8fbdc87e874b671: Status 404 returned error can't find the container with id d14ab964eab0742988454b2c3699a7d655d1da7efc074684c8fbdc87e874b671 Apr 22 15:58:42.214336 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:42.214302 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1b11795_9e34_41fd_9198_cc57fa3cfbf7.slice/crio-7a398099c9906d0a8fb878e0f82b706b408bf3b4cfa7df1069908bbbb98dbcf1 WatchSource:0}: Error finding container 7a398099c9906d0a8fb878e0f82b706b408bf3b4cfa7df1069908bbbb98dbcf1: Status 404 returned error can't find the container with id 7a398099c9906d0a8fb878e0f82b706b408bf3b4cfa7df1069908bbbb98dbcf1 Apr 22 15:58:42.215594 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:42.215479 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ebd4e04_7111_4378_9b6d_f2d25a0e4642.slice/crio-0170ae998df0cabc5377fdd1c6d430dc093d6a41f66c58f8b998d05d920976d2 WatchSource:0}: Error finding container 0170ae998df0cabc5377fdd1c6d430dc093d6a41f66c58f8b998d05d920976d2: Status 404 returned error can't find the container with id 0170ae998df0cabc5377fdd1c6d430dc093d6a41f66c58f8b998d05d920976d2 Apr 22 15:58:42.216594 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:42.216576 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5756e223_5da3_420b_a640_5e3cdce35004.slice/crio-3472ac2f51fe3faa60453dfccadc8cebb858fe45d3e6d0deefb7462f46b4d2c6 WatchSource:0}: Error finding container 
3472ac2f51fe3faa60453dfccadc8cebb858fe45d3e6d0deefb7462f46b4d2c6: Status 404 returned error can't find the container with id 3472ac2f51fe3faa60453dfccadc8cebb858fe45d3e6d0deefb7462f46b4d2c6 Apr 22 15:58:42.217693 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:42.217630 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod757bc440_2a2c_42f8_8e5d_03be90e55484.slice/crio-cfc5250a8929097bb5fbc9f442ab5a05d64ac2a5153479e718688a3d29c54cf4 WatchSource:0}: Error finding container cfc5250a8929097bb5fbc9f442ab5a05d64ac2a5153479e718688a3d29c54cf4: Status 404 returned error can't find the container with id cfc5250a8929097bb5fbc9f442ab5a05d64ac2a5153479e718688a3d29c54cf4 Apr 22 15:58:42.221617 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:58:42.221595 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd3073fe_435c_4974_821b_9229018bf5f4.slice/crio-148d852b98183f1b209f99fcfec1545753b486ba62021cf794919826aadf34ac WatchSource:0}: Error finding container 148d852b98183f1b209f99fcfec1545753b486ba62021cf794919826aadf34ac: Status 404 returned error can't find the container with id 148d852b98183f1b209f99fcfec1545753b486ba62021cf794919826aadf34ac Apr 22 15:58:42.229036 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:42.229009 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7m8g\" (UniqueName: \"kubernetes.io/projected/462dda43-d18b-4f55-b5d0-d9b9cbbb2e60-kube-api-access-x7m8g\") pod \"network-check-target-dqwgt\" (UID: \"462dda43-d18b-4f55-b5d0-d9b9cbbb2e60\") " pod="openshift-network-diagnostics/network-check-target-dqwgt" Apr 22 15:58:42.229169 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:42.229154 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 
15:58:42.229236 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:42.229177 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:58:42.229236 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:42.229207 2572 projected.go:194] Error preparing data for projected volume kube-api-access-x7m8g for pod openshift-network-diagnostics/network-check-target-dqwgt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:42.229354 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:42.229262 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/462dda43-d18b-4f55-b5d0-d9b9cbbb2e60-kube-api-access-x7m8g podName:462dda43-d18b-4f55-b5d0-d9b9cbbb2e60 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:43.229243472 +0000 UTC m=+4.128797539 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-x7m8g" (UniqueName: "kubernetes.io/projected/462dda43-d18b-4f55-b5d0-d9b9cbbb2e60-kube-api-access-x7m8g") pod "network-check-target-dqwgt" (UID: "462dda43-d18b-4f55-b5d0-d9b9cbbb2e60") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:42.559908 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:42.559637 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 15:53:40 +0000 UTC" deadline="2027-09-25 16:08:30.537823906 +0000 UTC" Apr 22 15:58:42.559908 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:42.559849 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12504h9m47.977980133s" Apr 22 15:58:42.647652 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:42.647605 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2q74x" event={"ID":"f4e23e3e-8847-4777-b278-f9b1ef808fd3","Type":"ContainerStarted","Data":"e51a01934f0b79c1f75d18b5b0fb6aab83ae55883ee488a5ee2812ca365305ad"} Apr 22 15:58:42.649447 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:42.649417 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh" event={"ID":"0b05e334-2590-45f8-bdfc-e5f6d56bfea3","Type":"ContainerStarted","Data":"d0d0b6863e861a6a9d4f62e069376212f4a00f77f0fd217e8a34f0f3dfd0a458"} Apr 22 15:58:42.651502 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:42.651471 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-9.ec2.internal" event={"ID":"4bba7b470f499a19673a3db16932ed93","Type":"ContainerStarted","Data":"d7228021e80c2cd9ff5e1279338385b9e29472ce293db01c219feb2b3f05bd49"} Apr 22 15:58:42.655904 ip-10-0-135-9 
kubenswrapper[2572]: I0422 15:58:42.655320 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" event={"ID":"fd3073fe-435c-4974-821b-9229018bf5f4","Type":"ContainerStarted","Data":"148d852b98183f1b209f99fcfec1545753b486ba62021cf794919826aadf34ac"} Apr 22 15:58:42.657180 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:42.657135 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9d6jl" event={"ID":"5756e223-5da3-420b-a640-5e3cdce35004","Type":"ContainerStarted","Data":"3472ac2f51fe3faa60453dfccadc8cebb858fe45d3e6d0deefb7462f46b4d2c6"} Apr 22 15:58:42.661051 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:42.660264 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nbrr9" event={"ID":"9ebd4e04-7111-4378-9b6d-f2d25a0e4642","Type":"ContainerStarted","Data":"0170ae998df0cabc5377fdd1c6d430dc093d6a41f66c58f8b998d05d920976d2"} Apr 22 15:58:42.668803 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:42.668747 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4sqfs" event={"ID":"a60066e5-252b-4865-879a-0d0d3a6618d4","Type":"ContainerStarted","Data":"d14ab964eab0742988454b2c3699a7d655d1da7efc074684c8fbdc87e874b671"} Apr 22 15:58:42.671994 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:42.671859 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" event={"ID":"8071611a-9b57-488d-9246-ad02e7c43ccb","Type":"ContainerStarted","Data":"ece6feaff2c505d5944b76ebbe11c030d49b3e4d6f1a812d5ce9c6af141e58f0"} Apr 22 15:58:42.676394 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:42.676356 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmd9q" event={"ID":"757bc440-2a2c-42f8-8e5d-03be90e55484","Type":"ContainerStarted","Data":"cfc5250a8929097bb5fbc9f442ab5a05d64ac2a5153479e718688a3d29c54cf4"} Apr 22 15:58:42.678099 ip-10-0-135-9 
kubenswrapper[2572]: I0422 15:58:42.678069 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4tvm4" event={"ID":"a1b11795-9e34-41fd-9198-cc57fa3cfbf7","Type":"ContainerStarted","Data":"7a398099c9906d0a8fb878e0f82b706b408bf3b4cfa7df1069908bbbb98dbcf1"} Apr 22 15:58:43.134114 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:43.134031 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs\") pod \"network-metrics-daemon-76x4b\" (UID: \"c090a1ee-5091-44d6-9e1b-65bf4dc8b1be\") " pod="openshift-multus/network-metrics-daemon-76x4b" Apr 22 15:58:43.134303 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:43.134240 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:43.134366 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:43.134309 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs podName:c090a1ee-5091-44d6-9e1b-65bf4dc8b1be nodeName:}" failed. No retries permitted until 2026-04-22 15:58:45.134289638 +0000 UTC m=+6.033843702 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs") pod "network-metrics-daemon-76x4b" (UID: "c090a1ee-5091-44d6-9e1b-65bf4dc8b1be") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:43.234788 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:43.234729 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7m8g\" (UniqueName: \"kubernetes.io/projected/462dda43-d18b-4f55-b5d0-d9b9cbbb2e60-kube-api-access-x7m8g\") pod \"network-check-target-dqwgt\" (UID: \"462dda43-d18b-4f55-b5d0-d9b9cbbb2e60\") " pod="openshift-network-diagnostics/network-check-target-dqwgt" Apr 22 15:58:43.234970 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:43.234924 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:58:43.234970 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:43.234942 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:58:43.234970 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:43.234956 2572 projected.go:194] Error preparing data for projected volume kube-api-access-x7m8g for pod openshift-network-diagnostics/network-check-target-dqwgt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:43.235113 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:43.235020 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/462dda43-d18b-4f55-b5d0-d9b9cbbb2e60-kube-api-access-x7m8g podName:462dda43-d18b-4f55-b5d0-d9b9cbbb2e60 nodeName:}" failed. 
No retries permitted until 2026-04-22 15:58:45.234998223 +0000 UTC m=+6.134552277 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-x7m8g" (UniqueName: "kubernetes.io/projected/462dda43-d18b-4f55-b5d0-d9b9cbbb2e60-kube-api-access-x7m8g") pod "network-check-target-dqwgt" (UID: "462dda43-d18b-4f55-b5d0-d9b9cbbb2e60") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:43.557151 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:43.557035 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-9.ec2.internal" podStartSLOduration=2.557012513 podStartE2EDuration="2.557012513s" podCreationTimestamp="2026-04-22 15:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:58:42.664450212 +0000 UTC m=+3.564004284" watchObservedRunningTime="2026-04-22 15:58:43.557012513 +0000 UTC m=+4.456566577" Apr 22 15:58:43.557574 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:43.557551 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-rs5cd"] Apr 22 15:58:43.560844 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:43.560822 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:58:43.561238 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:43.560908 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rs5cd" podUID="c2282f87-8976-4573-9b9c-d12f85477077" Apr 22 15:58:43.641072 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:43.640032 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-76x4b" Apr 22 15:58:43.641072 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:43.640175 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-76x4b" podUID="c090a1ee-5091-44d6-9e1b-65bf4dc8b1be" Apr 22 15:58:43.641072 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:43.640587 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dqwgt" Apr 22 15:58:43.641072 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:43.640675 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dqwgt" podUID="462dda43-d18b-4f55-b5d0-d9b9cbbb2e60" Apr 22 15:58:43.641072 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:43.640803 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c2282f87-8976-4573-9b9c-d12f85477077-kubelet-config\") pod \"global-pull-secret-syncer-rs5cd\" (UID: \"c2282f87-8976-4573-9b9c-d12f85477077\") " pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:58:43.641072 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:43.640857 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c2282f87-8976-4573-9b9c-d12f85477077-dbus\") pod \"global-pull-secret-syncer-rs5cd\" (UID: \"c2282f87-8976-4573-9b9c-d12f85477077\") " pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:58:43.641072 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:43.640889 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c2282f87-8976-4573-9b9c-d12f85477077-original-pull-secret\") pod \"global-pull-secret-syncer-rs5cd\" (UID: \"c2282f87-8976-4573-9b9c-d12f85477077\") " pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:58:43.688802 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:43.688762 2572 generic.go:358] "Generic (PLEG): container finished" podID="c011865a41651b31b930def52edfe788" containerID="61935f78708dafdd552a61ab825be1013a86705fb2675d34dfd3c8428815063b" exitCode=0 Apr 22 15:58:43.688986 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:43.688942 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-9.ec2.internal" 
event={"ID":"c011865a41651b31b930def52edfe788","Type":"ContainerDied","Data":"61935f78708dafdd552a61ab825be1013a86705fb2675d34dfd3c8428815063b"} Apr 22 15:58:43.741732 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:43.741669 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c2282f87-8976-4573-9b9c-d12f85477077-original-pull-secret\") pod \"global-pull-secret-syncer-rs5cd\" (UID: \"c2282f87-8976-4573-9b9c-d12f85477077\") " pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:58:43.741809 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:43.741761 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c2282f87-8976-4573-9b9c-d12f85477077-kubelet-config\") pod \"global-pull-secret-syncer-rs5cd\" (UID: \"c2282f87-8976-4573-9b9c-d12f85477077\") " pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:58:43.741864 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:43.741803 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c2282f87-8976-4573-9b9c-d12f85477077-dbus\") pod \"global-pull-secret-syncer-rs5cd\" (UID: \"c2282f87-8976-4573-9b9c-d12f85477077\") " pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:58:43.742055 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:43.742036 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c2282f87-8976-4573-9b9c-d12f85477077-dbus\") pod \"global-pull-secret-syncer-rs5cd\" (UID: \"c2282f87-8976-4573-9b9c-d12f85477077\") " pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:58:43.742173 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:43.742158 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:58:43.742257 
ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:43.742245 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2282f87-8976-4573-9b9c-d12f85477077-original-pull-secret podName:c2282f87-8976-4573-9b9c-d12f85477077 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:44.242218279 +0000 UTC m=+5.141772344 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c2282f87-8976-4573-9b9c-d12f85477077-original-pull-secret") pod "global-pull-secret-syncer-rs5cd" (UID: "c2282f87-8976-4573-9b9c-d12f85477077") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:58:43.742328 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:43.742298 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c2282f87-8976-4573-9b9c-d12f85477077-kubelet-config\") pod \"global-pull-secret-syncer-rs5cd\" (UID: \"c2282f87-8976-4573-9b9c-d12f85477077\") " pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:58:44.246546 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:44.246430 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c2282f87-8976-4573-9b9c-d12f85477077-original-pull-secret\") pod \"global-pull-secret-syncer-rs5cd\" (UID: \"c2282f87-8976-4573-9b9c-d12f85477077\") " pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:58:44.246717 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:44.246599 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:58:44.246717 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:44.246664 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2282f87-8976-4573-9b9c-d12f85477077-original-pull-secret podName:c2282f87-8976-4573-9b9c-d12f85477077 
nodeName:}" failed. No retries permitted until 2026-04-22 15:58:45.246644272 +0000 UTC m=+6.146198326 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c2282f87-8976-4573-9b9c-d12f85477077-original-pull-secret") pod "global-pull-secret-syncer-rs5cd" (UID: "c2282f87-8976-4573-9b9c-d12f85477077") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:58:44.697537 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:44.697495 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-9.ec2.internal" event={"ID":"c011865a41651b31b930def52edfe788","Type":"ContainerStarted","Data":"4c599bd0f4d2753b53fd421c355d030ddf05e34a967721978f5240aae0206d8b"} Apr 22 15:58:45.155912 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:45.155871 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs\") pod \"network-metrics-daemon-76x4b\" (UID: \"c090a1ee-5091-44d6-9e1b-65bf4dc8b1be\") " pod="openshift-multus/network-metrics-daemon-76x4b" Apr 22 15:58:45.156104 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:45.156006 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:45.156104 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:45.156068 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs podName:c090a1ee-5091-44d6-9e1b-65bf4dc8b1be nodeName:}" failed. No retries permitted until 2026-04-22 15:58:49.156049213 +0000 UTC m=+10.055603263 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs") pod "network-metrics-daemon-76x4b" (UID: "c090a1ee-5091-44d6-9e1b-65bf4dc8b1be") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:45.257087 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:45.257046 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c2282f87-8976-4573-9b9c-d12f85477077-original-pull-secret\") pod \"global-pull-secret-syncer-rs5cd\" (UID: \"c2282f87-8976-4573-9b9c-d12f85477077\") " pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:58:45.257258 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:45.257140 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7m8g\" (UniqueName: \"kubernetes.io/projected/462dda43-d18b-4f55-b5d0-d9b9cbbb2e60-kube-api-access-x7m8g\") pod \"network-check-target-dqwgt\" (UID: \"462dda43-d18b-4f55-b5d0-d9b9cbbb2e60\") " pod="openshift-network-diagnostics/network-check-target-dqwgt" Apr 22 15:58:45.257351 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:45.257328 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:58:45.257422 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:45.257354 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:58:45.257422 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:45.257367 2572 projected.go:194] Error preparing data for projected volume kube-api-access-x7m8g for pod openshift-network-diagnostics/network-check-target-dqwgt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:45.257422 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:45.257390 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:58:45.257576 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:45.257435 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/462dda43-d18b-4f55-b5d0-d9b9cbbb2e60-kube-api-access-x7m8g podName:462dda43-d18b-4f55-b5d0-d9b9cbbb2e60 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:49.25741565 +0000 UTC m=+10.156969712 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-x7m8g" (UniqueName: "kubernetes.io/projected/462dda43-d18b-4f55-b5d0-d9b9cbbb2e60-kube-api-access-x7m8g") pod "network-check-target-dqwgt" (UID: "462dda43-d18b-4f55-b5d0-d9b9cbbb2e60") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:45.257576 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:45.257463 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2282f87-8976-4573-9b9c-d12f85477077-original-pull-secret podName:c2282f87-8976-4573-9b9c-d12f85477077 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:47.257444556 +0000 UTC m=+8.156998621 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c2282f87-8976-4573-9b9c-d12f85477077-original-pull-secret") pod "global-pull-secret-syncer-rs5cd" (UID: "c2282f87-8976-4573-9b9c-d12f85477077") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:58:45.640976 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:45.640525 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dqwgt" Apr 22 15:58:45.640976 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:45.640525 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-76x4b" Apr 22 15:58:45.640976 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:45.640677 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dqwgt" podUID="462dda43-d18b-4f55-b5d0-d9b9cbbb2e60" Apr 22 15:58:45.640976 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:45.640713 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-76x4b" podUID="c090a1ee-5091-44d6-9e1b-65bf4dc8b1be" Apr 22 15:58:45.640976 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:45.640770 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:58:45.640976 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:45.640898 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rs5cd" podUID="c2282f87-8976-4573-9b9c-d12f85477077" Apr 22 15:58:47.275538 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:47.275492 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c2282f87-8976-4573-9b9c-d12f85477077-original-pull-secret\") pod \"global-pull-secret-syncer-rs5cd\" (UID: \"c2282f87-8976-4573-9b9c-d12f85477077\") " pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:58:47.275949 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:47.275673 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:58:47.275949 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:47.275737 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2282f87-8976-4573-9b9c-d12f85477077-original-pull-secret podName:c2282f87-8976-4573-9b9c-d12f85477077 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:51.275717923 +0000 UTC m=+12.175271976 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c2282f87-8976-4573-9b9c-d12f85477077-original-pull-secret") pod "global-pull-secret-syncer-rs5cd" (UID: "c2282f87-8976-4573-9b9c-d12f85477077") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:58:47.640631 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:47.640592 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:58:47.640799 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:47.640725 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rs5cd" podUID="c2282f87-8976-4573-9b9c-d12f85477077" Apr 22 15:58:47.643820 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:47.641219 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-76x4b" Apr 22 15:58:47.643820 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:47.641341 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-76x4b" podUID="c090a1ee-5091-44d6-9e1b-65bf4dc8b1be" Apr 22 15:58:47.643820 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:47.641419 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dqwgt" Apr 22 15:58:47.643820 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:47.641485 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dqwgt" podUID="462dda43-d18b-4f55-b5d0-d9b9cbbb2e60" Apr 22 15:58:49.193765 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:49.193256 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs\") pod \"network-metrics-daemon-76x4b\" (UID: \"c090a1ee-5091-44d6-9e1b-65bf4dc8b1be\") " pod="openshift-multus/network-metrics-daemon-76x4b" Apr 22 15:58:49.193765 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:49.193417 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:49.193765 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:49.193472 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs podName:c090a1ee-5091-44d6-9e1b-65bf4dc8b1be nodeName:}" failed. No retries permitted until 2026-04-22 15:58:57.193458436 +0000 UTC m=+18.093012485 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs") pod "network-metrics-daemon-76x4b" (UID: "c090a1ee-5091-44d6-9e1b-65bf4dc8b1be") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:49.293964 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:49.293816 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7m8g\" (UniqueName: \"kubernetes.io/projected/462dda43-d18b-4f55-b5d0-d9b9cbbb2e60-kube-api-access-x7m8g\") pod \"network-check-target-dqwgt\" (UID: \"462dda43-d18b-4f55-b5d0-d9b9cbbb2e60\") " pod="openshift-network-diagnostics/network-check-target-dqwgt" Apr 22 15:58:49.294145 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:49.294043 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:58:49.294145 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:49.294069 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:58:49.294145 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:49.294083 2572 projected.go:194] Error preparing data for projected volume kube-api-access-x7m8g for pod openshift-network-diagnostics/network-check-target-dqwgt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:49.294145 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:49.294144 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/462dda43-d18b-4f55-b5d0-d9b9cbbb2e60-kube-api-access-x7m8g podName:462dda43-d18b-4f55-b5d0-d9b9cbbb2e60 nodeName:}" failed. 
No retries permitted until 2026-04-22 15:58:57.294127326 +0000 UTC m=+18.193681391 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-x7m8g" (UniqueName: "kubernetes.io/projected/462dda43-d18b-4f55-b5d0-d9b9cbbb2e60-kube-api-access-x7m8g") pod "network-check-target-dqwgt" (UID: "462dda43-d18b-4f55-b5d0-d9b9cbbb2e60") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:49.642920 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:49.642886 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:58:49.643097 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:49.642886 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dqwgt" Apr 22 15:58:49.643097 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:49.643015 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rs5cd" podUID="c2282f87-8976-4573-9b9c-d12f85477077" Apr 22 15:58:49.643097 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:49.643081 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dqwgt" podUID="462dda43-d18b-4f55-b5d0-d9b9cbbb2e60" Apr 22 15:58:49.643322 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:49.642891 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-76x4b" Apr 22 15:58:49.643322 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:49.643180 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-76x4b" podUID="c090a1ee-5091-44d6-9e1b-65bf4dc8b1be" Apr 22 15:58:51.308311 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:51.308266 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c2282f87-8976-4573-9b9c-d12f85477077-original-pull-secret\") pod \"global-pull-secret-syncer-rs5cd\" (UID: \"c2282f87-8976-4573-9b9c-d12f85477077\") " pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:58:51.308751 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:51.308467 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:58:51.308751 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:51.308528 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2282f87-8976-4573-9b9c-d12f85477077-original-pull-secret podName:c2282f87-8976-4573-9b9c-d12f85477077 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:59.308509165 +0000 UTC m=+20.208063227 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c2282f87-8976-4573-9b9c-d12f85477077-original-pull-secret") pod "global-pull-secret-syncer-rs5cd" (UID: "c2282f87-8976-4573-9b9c-d12f85477077") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:58:51.640276 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:51.640245 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dqwgt" Apr 22 15:58:51.640429 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:51.640245 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-76x4b" Apr 22 15:58:51.640429 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:51.640361 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dqwgt" podUID="462dda43-d18b-4f55-b5d0-d9b9cbbb2e60" Apr 22 15:58:51.640558 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:51.640476 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-76x4b" podUID="c090a1ee-5091-44d6-9e1b-65bf4dc8b1be" Apr 22 15:58:51.640558 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:51.640244 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:58:51.640657 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:51.640554 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rs5cd" podUID="c2282f87-8976-4573-9b9c-d12f85477077" Apr 22 15:58:53.640639 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:53.640603 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:58:53.641047 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:53.640738 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rs5cd" podUID="c2282f87-8976-4573-9b9c-d12f85477077" Apr 22 15:58:53.641047 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:53.640615 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dqwgt" Apr 22 15:58:53.641047 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:53.640829 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dqwgt" podUID="462dda43-d18b-4f55-b5d0-d9b9cbbb2e60"
Apr 22 15:58:53.641047 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:53.640609 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-76x4b"
Apr 22 15:58:53.641047 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:53.640917 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-76x4b" podUID="c090a1ee-5091-44d6-9e1b-65bf4dc8b1be"
Apr 22 15:58:55.640151 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:55.640108 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dqwgt"
Apr 22 15:58:55.640151 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:55.640141 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-76x4b"
Apr 22 15:58:55.640619 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:55.640143 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rs5cd"
Apr 22 15:58:55.640619 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:55.640259 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dqwgt" podUID="462dda43-d18b-4f55-b5d0-d9b9cbbb2e60"
Apr 22 15:58:55.640619 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:55.640356 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-76x4b" podUID="c090a1ee-5091-44d6-9e1b-65bf4dc8b1be"
Apr 22 15:58:55.640619 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:55.640450 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rs5cd" podUID="c2282f87-8976-4573-9b9c-d12f85477077"
Apr 22 15:58:57.250696 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:57.250658 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs\") pod \"network-metrics-daemon-76x4b\" (UID: \"c090a1ee-5091-44d6-9e1b-65bf4dc8b1be\") " pod="openshift-multus/network-metrics-daemon-76x4b"
Apr 22 15:58:57.251235 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:57.250835 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:58:57.251235 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:57.250919 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs podName:c090a1ee-5091-44d6-9e1b-65bf4dc8b1be nodeName:}" failed. No retries permitted until 2026-04-22 15:59:13.250898068 +0000 UTC m=+34.150452317 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs") pod "network-metrics-daemon-76x4b" (UID: "c090a1ee-5091-44d6-9e1b-65bf4dc8b1be") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:58:57.351836 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:57.351797 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7m8g\" (UniqueName: \"kubernetes.io/projected/462dda43-d18b-4f55-b5d0-d9b9cbbb2e60-kube-api-access-x7m8g\") pod \"network-check-target-dqwgt\" (UID: \"462dda43-d18b-4f55-b5d0-d9b9cbbb2e60\") " pod="openshift-network-diagnostics/network-check-target-dqwgt"
Apr 22 15:58:57.352034 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:57.351998 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 15:58:57.352034 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:57.352023 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 15:58:57.352034 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:57.352033 2572 projected.go:194] Error preparing data for projected volume kube-api-access-x7m8g for pod openshift-network-diagnostics/network-check-target-dqwgt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 15:58:57.352212 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:57.352092 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/462dda43-d18b-4f55-b5d0-d9b9cbbb2e60-kube-api-access-x7m8g podName:462dda43-d18b-4f55-b5d0-d9b9cbbb2e60 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:13.352075477 +0000 UTC m=+34.251629542 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-x7m8g" (UniqueName: "kubernetes.io/projected/462dda43-d18b-4f55-b5d0-d9b9cbbb2e60-kube-api-access-x7m8g") pod "network-check-target-dqwgt" (UID: "462dda43-d18b-4f55-b5d0-d9b9cbbb2e60") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 15:58:57.640539 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:57.640504 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dqwgt"
Apr 22 15:58:57.640812 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:57.640504 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-76x4b"
Apr 22 15:58:57.640812 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:57.640623 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dqwgt" podUID="462dda43-d18b-4f55-b5d0-d9b9cbbb2e60"
Apr 22 15:58:57.640812 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:57.640724 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-76x4b" podUID="c090a1ee-5091-44d6-9e1b-65bf4dc8b1be"
Apr 22 15:58:57.640812 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:57.640504 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rs5cd"
Apr 22 15:58:57.640993 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:57.640822 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rs5cd" podUID="c2282f87-8976-4573-9b9c-d12f85477077"
Apr 22 15:58:59.367448 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:59.367420 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c2282f87-8976-4573-9b9c-d12f85477077-original-pull-secret\") pod \"global-pull-secret-syncer-rs5cd\" (UID: \"c2282f87-8976-4573-9b9c-d12f85477077\") " pod="kube-system/global-pull-secret-syncer-rs5cd"
Apr 22 15:58:59.367780 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:59.367538 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 15:58:59.367780 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:59.367593 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2282f87-8976-4573-9b9c-d12f85477077-original-pull-secret podName:c2282f87-8976-4573-9b9c-d12f85477077 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:15.367575544 +0000 UTC m=+36.267129595 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c2282f87-8976-4573-9b9c-d12f85477077-original-pull-secret") pod "global-pull-secret-syncer-rs5cd" (UID: "c2282f87-8976-4573-9b9c-d12f85477077") : object "kube-system"/"original-pull-secret" not registered
Apr 22 15:58:59.641042 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:59.640959 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-76x4b"
Apr 22 15:58:59.641208 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:59.641067 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-76x4b" podUID="c090a1ee-5091-44d6-9e1b-65bf4dc8b1be"
Apr 22 15:58:59.641208 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:59.641115 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dqwgt"
Apr 22 15:58:59.641208 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:59.641174 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dqwgt" podUID="462dda43-d18b-4f55-b5d0-d9b9cbbb2e60"
Apr 22 15:58:59.641360 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:58:59.641259 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rs5cd"
Apr 22 15:58:59.641412 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:58:59.641379 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rs5cd" podUID="c2282f87-8976-4573-9b9c-d12f85477077"
Apr 22 15:59:00.729211 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:00.728773 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh" event={"ID":"0b05e334-2590-45f8-bdfc-e5f6d56bfea3","Type":"ContainerStarted","Data":"f7aceaf220b88fa838c7abef5c6c50429a1a6ef8e01374c6641edb2e0c46a2e7"}
Apr 22 15:59:00.731576 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:00.731547 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxznf_fd3073fe-435c-4974-821b-9229018bf5f4/ovn-acl-logging/0.log"
Apr 22 15:59:00.731920 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:00.731900 2572 generic.go:358] "Generic (PLEG): container finished" podID="fd3073fe-435c-4974-821b-9229018bf5f4" containerID="bac0f8ddd6c58c4a2e030af5ebd25fb7a9cbb7666551fb731913a2f4400ffacb" exitCode=1
Apr 22 15:59:00.731996 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:00.731964 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" event={"ID":"fd3073fe-435c-4974-821b-9229018bf5f4","Type":"ContainerStarted","Data":"f9b08bfd9653cd6a73a2d2656a0a4eef7c6870b3df90a23c598284db9393b5f5"}
Apr 22 15:59:00.732053 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:00.731999 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" event={"ID":"fd3073fe-435c-4974-821b-9229018bf5f4","Type":"ContainerStarted","Data":"b8f40b772e0d1ff28eeb674e01d0ab52cf8ef2ada2b000aacb1d18b0cf464ce2"}
Apr 22 15:59:00.732053 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:00.732011 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" event={"ID":"fd3073fe-435c-4974-821b-9229018bf5f4","Type":"ContainerStarted","Data":"429b8a729822f0a70aac013d9b05cb6fef394a3b6005051d491385c678b13521"}
Apr 22 15:59:00.732053 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:00.732018 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" event={"ID":"fd3073fe-435c-4974-821b-9229018bf5f4","Type":"ContainerStarted","Data":"1e12da1880b32972205741b75fdd6615bef72c0e5fe6ffaaf579422693453eb5"}
Apr 22 15:59:00.732053 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:00.732029 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" event={"ID":"fd3073fe-435c-4974-821b-9229018bf5f4","Type":"ContainerDied","Data":"bac0f8ddd6c58c4a2e030af5ebd25fb7a9cbb7666551fb731913a2f4400ffacb"}
Apr 22 15:59:00.732053 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:00.732044 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" event={"ID":"fd3073fe-435c-4974-821b-9229018bf5f4","Type":"ContainerStarted","Data":"36e0632ccf1f561f6b37f2f5ac10e84903e99b79f010861f12a345a5b2c9510b"}
Apr 22 15:59:00.733400 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:00.733369 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9d6jl" event={"ID":"5756e223-5da3-420b-a640-5e3cdce35004","Type":"ContainerStarted","Data":"85d41be381fdead9295ebd7c20324941f83b5d5964c10e26c5d2ca6a5224e5d3"}
Apr 22 15:59:00.734866 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:00.734844 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nbrr9" event={"ID":"9ebd4e04-7111-4378-9b6d-f2d25a0e4642","Type":"ContainerStarted","Data":"640b0bcf00a6022682cf7ae4eee9a7c6629439f0ca912b166e07e0a11588ddaf"}
Apr 22 15:59:00.736230 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:00.736188 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4sqfs" event={"ID":"a60066e5-252b-4865-879a-0d0d3a6618d4","Type":"ContainerStarted","Data":"7e67f9c2725f7750145ad1b5e5ba0a8cb80ff057a3fefe4bd3fd04be5c026f93"}
Apr 22 15:59:00.737556 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:00.737534 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" event={"ID":"8071611a-9b57-488d-9246-ad02e7c43ccb","Type":"ContainerStarted","Data":"4c088fd81301814a2b1be4bddd73b92aa987b292de2a7f66b0d791fc03de2506"}
Apr 22 15:59:00.739044 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:00.739022 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmd9q" event={"ID":"757bc440-2a2c-42f8-8e5d-03be90e55484","Type":"ContainerStarted","Data":"377de58bdeacf16a77322e743da42491d0cee6723ece3e9a48968f00f9c72ff6"}
Apr 22 15:59:00.741775 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:00.741749 2572 generic.go:358] "Generic (PLEG): container finished" podID="a1b11795-9e34-41fd-9198-cc57fa3cfbf7" containerID="634922450b2ec83b44cbc644ab5596b6adae33a42534c004a04be759b55b5215" exitCode=0
Apr 22 15:59:00.741902 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:00.741787 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4tvm4" event={"ID":"a1b11795-9e34-41fd-9198-cc57fa3cfbf7","Type":"ContainerDied","Data":"634922450b2ec83b44cbc644ab5596b6adae33a42534c004a04be759b55b5215"}
Apr 22 15:59:00.745996 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:00.745951 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-9.ec2.internal" podStartSLOduration=19.745937504 podStartE2EDuration="19.745937504s" podCreationTimestamp="2026-04-22 15:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:58:44.712267257 +0000 UTC m=+5.611821329" watchObservedRunningTime="2026-04-22 15:59:00.745937504 +0000 UTC m=+21.645491574"
Apr 22 15:59:00.757893 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:00.757849 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9d6jl" podStartSLOduration=4.621094448 podStartE2EDuration="21.757833405s" podCreationTimestamp="2026-04-22 15:58:39 +0000 UTC" firstStartedPulling="2026-04-22 15:58:42.220376832 +0000 UTC m=+3.119930884" lastFinishedPulling="2026-04-22 15:58:59.357115789 +0000 UTC m=+20.256669841" observedRunningTime="2026-04-22 15:59:00.745834842 +0000 UTC m=+21.645388916" watchObservedRunningTime="2026-04-22 15:59:00.757833405 +0000 UTC m=+21.657387547"
Apr 22 15:59:00.771062 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:00.771007 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-nbrr9" podStartSLOduration=4.282685756 podStartE2EDuration="21.770994265s" podCreationTimestamp="2026-04-22 15:58:39 +0000 UTC" firstStartedPulling="2026-04-22 15:58:42.220744851 +0000 UTC m=+3.120298909" lastFinishedPulling="2026-04-22 15:58:59.709053357 +0000 UTC m=+20.608607418" observedRunningTime="2026-04-22 15:59:00.757596036 +0000 UTC m=+21.657150106" watchObservedRunningTime="2026-04-22 15:59:00.770994265 +0000 UTC m=+21.670548347"
Apr 22 15:59:00.771309 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:00.771276 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4sqfs" podStartSLOduration=4.629733301 podStartE2EDuration="21.771269416s" podCreationTimestamp="2026-04-22 15:58:39 +0000 UTC" firstStartedPulling="2026-04-22 15:58:42.215577868 +0000 UTC m=+3.115131926" lastFinishedPulling="2026-04-22 15:58:59.357113989 +0000 UTC m=+20.256668041" observedRunningTime="2026-04-22 15:59:00.770643815 +0000 UTC m=+21.670197887" watchObservedRunningTime="2026-04-22 15:59:00.771269416 +0000 UTC m=+21.670823512"
Apr 22 15:59:00.784843 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:00.784799 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ttmhq" podStartSLOduration=4.256804073 podStartE2EDuration="21.784783818s" podCreationTimestamp="2026-04-22 15:58:39 +0000 UTC" firstStartedPulling="2026-04-22 15:58:42.208831948 +0000 UTC m=+3.108386000" lastFinishedPulling="2026-04-22 15:58:59.736811682 +0000 UTC m=+20.636365745" observedRunningTime="2026-04-22 15:59:00.784577705 +0000 UTC m=+21.684131777" watchObservedRunningTime="2026-04-22 15:59:00.784783818 +0000 UTC m=+21.684337896"
Apr 22 15:59:00.799738 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:00.799681 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-cmd9q" podStartSLOduration=4.277566794 podStartE2EDuration="21.79966263s" podCreationTimestamp="2026-04-22 15:58:39 +0000 UTC" firstStartedPulling="2026-04-22 15:58:42.224003816 +0000 UTC m=+3.123557865" lastFinishedPulling="2026-04-22 15:58:59.74609965 +0000 UTC m=+20.645653701" observedRunningTime="2026-04-22 15:59:00.799387763 +0000 UTC m=+21.698941856" watchObservedRunningTime="2026-04-22 15:59:00.79966263 +0000 UTC m=+21.699216703"
Apr 22 15:59:01.115042 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:01.114957 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 15:59:01.593175 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:01.593069 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T15:59:01.114985668Z","UUID":"0029929b-4915-42e4-a3f4-dce9e345b700","Handler":null,"Name":"","Endpoint":""}
Apr 22 15:59:01.597241 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:01.597215 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 15:59:01.597393 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:01.597248 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 15:59:01.640133 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:01.640077 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rs5cd"
Apr 22 15:59:01.640133 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:01.640137 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-76x4b"
Apr 22 15:59:01.640407 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:01.640109 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dqwgt"
Apr 22 15:59:01.640407 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:01.640251 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rs5cd" podUID="c2282f87-8976-4573-9b9c-d12f85477077"
Apr 22 15:59:01.640407 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:01.640392 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dqwgt" podUID="462dda43-d18b-4f55-b5d0-d9b9cbbb2e60"
Apr 22 15:59:01.640532 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:01.640472 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-76x4b" podUID="c090a1ee-5091-44d6-9e1b-65bf4dc8b1be"
Apr 22 15:59:01.745947 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:01.745550 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2q74x" event={"ID":"f4e23e3e-8847-4777-b278-f9b1ef808fd3","Type":"ContainerStarted","Data":"fe182d904c263c2fadb3a26508ae1039d6d160c1d2ab2bfd4fa969af70df7a29"}
Apr 22 15:59:01.748992 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:01.747491 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh" event={"ID":"0b05e334-2590-45f8-bdfc-e5f6d56bfea3","Type":"ContainerStarted","Data":"c3f9479e3c3e9cc77bbecabbf7ccf3353a0d9026abe7515422c4d5a1fdf443e5"}
Apr 22 15:59:01.758713 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:01.758663 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2q74x" podStartSLOduration=5.233069618 podStartE2EDuration="22.758647502s" podCreationTimestamp="2026-04-22 15:58:39 +0000 UTC" firstStartedPulling="2026-04-22 15:58:42.214901126 +0000 UTC m=+3.114455178" lastFinishedPulling="2026-04-22 15:58:59.740479005 +0000 UTC m=+20.640033062" observedRunningTime="2026-04-22 15:59:01.758299686 +0000 UTC m=+22.657853759" watchObservedRunningTime="2026-04-22 15:59:01.758647502 +0000 UTC m=+22.658201575"
Apr 22 15:59:02.753326 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:02.753300 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxznf_fd3073fe-435c-4974-821b-9229018bf5f4/ovn-acl-logging/0.log"
Apr 22 15:59:02.753744 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:02.753711 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" event={"ID":"fd3073fe-435c-4974-821b-9229018bf5f4","Type":"ContainerStarted","Data":"ceccf262bde57b9998d2d0a5bcf87261a0cf11193251d4f004c4f954bc6d08d1"}
Apr 22 15:59:02.756141 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:02.756102 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh" event={"ID":"0b05e334-2590-45f8-bdfc-e5f6d56bfea3","Type":"ContainerStarted","Data":"0f26ca96229b0df04b0816cf68f205964523a7be38edff6ae627b485df158dff"}
Apr 22 15:59:02.771620 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:02.771557 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q76xh" podStartSLOduration=3.608423709 podStartE2EDuration="23.771536794s" podCreationTimestamp="2026-04-22 15:58:39 +0000 UTC" firstStartedPulling="2026-04-22 15:58:42.210262519 +0000 UTC m=+3.109816570" lastFinishedPulling="2026-04-22 15:59:02.373375601 +0000 UTC m=+23.272929655" observedRunningTime="2026-04-22 15:59:02.771337074 +0000 UTC m=+23.670891145" watchObservedRunningTime="2026-04-22 15:59:02.771536794 +0000 UTC m=+23.671090868"
Apr 22 15:59:03.641121 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:03.640917 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-76x4b"
Apr 22 15:59:03.641299 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:03.640917 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dqwgt"
Apr 22 15:59:03.641299 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:03.641245 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-76x4b" podUID="c090a1ee-5091-44d6-9e1b-65bf4dc8b1be"
Apr 22 15:59:03.641299 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:03.640929 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rs5cd"
Apr 22 15:59:03.641456 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:03.641296 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dqwgt" podUID="462dda43-d18b-4f55-b5d0-d9b9cbbb2e60"
Apr 22 15:59:03.641456 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:03.641380 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rs5cd" podUID="c2282f87-8976-4573-9b9c-d12f85477077"
Apr 22 15:59:05.640590 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:05.640344 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dqwgt"
Apr 22 15:59:05.641341 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:05.640356 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-76x4b"
Apr 22 15:59:05.641341 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:05.640683 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dqwgt" podUID="462dda43-d18b-4f55-b5d0-d9b9cbbb2e60"
Apr 22 15:59:05.641341 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:05.640356 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rs5cd"
Apr 22 15:59:05.641341 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:05.640762 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-76x4b" podUID="c090a1ee-5091-44d6-9e1b-65bf4dc8b1be"
Apr 22 15:59:05.641341 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:05.640817 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rs5cd" podUID="c2282f87-8976-4573-9b9c-d12f85477077"
Apr 22 15:59:05.735088 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:05.735057 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-nbrr9"
Apr 22 15:59:05.735663 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:05.735646 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-nbrr9"
Apr 22 15:59:05.763392 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:05.763359 2572 generic.go:358] "Generic (PLEG): container finished" podID="a1b11795-9e34-41fd-9198-cc57fa3cfbf7" containerID="72d7e214145b1980150af7b81d20e4e6a3eaed2d98bed57d47a5117038f86d10" exitCode=0
Apr 22 15:59:05.763568 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:05.763447 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4tvm4" event={"ID":"a1b11795-9e34-41fd-9198-cc57fa3cfbf7","Type":"ContainerDied","Data":"72d7e214145b1980150af7b81d20e4e6a3eaed2d98bed57d47a5117038f86d10"}
Apr 22 15:59:05.766861 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:05.766841 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxznf_fd3073fe-435c-4974-821b-9229018bf5f4/ovn-acl-logging/0.log"
Apr 22 15:59:05.767247 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:05.767228 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" event={"ID":"fd3073fe-435c-4974-821b-9229018bf5f4","Type":"ContainerStarted","Data":"be4299b4a13c3ad3bf2441b8d8cd0dcebb25ded49c79e5f7d7010ffc2f042bdb"}
Apr 22 15:59:05.767483 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:05.767465 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:59:05.767560 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:05.767493 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:59:05.767689 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:05.767671 2572 scope.go:117] "RemoveContainer" containerID="bac0f8ddd6c58c4a2e030af5ebd25fb7a9cbb7666551fb731913a2f4400ffacb"
Apr 22 15:59:05.785034 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:05.785005 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:59:06.707748 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:06.707647 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rs5cd"]
Apr 22 15:59:06.708239 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:06.707795 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rs5cd"
Apr 22 15:59:06.708239 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:06.707908 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rs5cd" podUID="c2282f87-8976-4573-9b9c-d12f85477077"
Apr 22 15:59:06.708369 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:06.708300 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dqwgt"]
Apr 22 15:59:06.708424 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:06.708414 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dqwgt"
Apr 22 15:59:06.708532 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:06.708511 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dqwgt" podUID="462dda43-d18b-4f55-b5d0-d9b9cbbb2e60"
Apr 22 15:59:06.709019 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:06.708998 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-76x4b"]
Apr 22 15:59:06.709127 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:06.709113 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-76x4b"
Apr 22 15:59:06.709313 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:06.709270 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-76x4b" podUID="c090a1ee-5091-44d6-9e1b-65bf4dc8b1be"
Apr 22 15:59:06.770877 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:06.770844 2572 generic.go:358] "Generic (PLEG): container finished" podID="a1b11795-9e34-41fd-9198-cc57fa3cfbf7" containerID="f489b5347a4481bc932a35978258357e5e86ceda9605b55abfbc890d03a89343" exitCode=0
Apr 22 15:59:06.771059 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:06.770934 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4tvm4" event={"ID":"a1b11795-9e34-41fd-9198-cc57fa3cfbf7","Type":"ContainerDied","Data":"f489b5347a4481bc932a35978258357e5e86ceda9605b55abfbc890d03a89343"}
Apr 22 15:59:06.774574 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:06.774555 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxznf_fd3073fe-435c-4974-821b-9229018bf5f4/ovn-acl-logging/0.log"
Apr 22 15:59:06.774897 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:06.774876 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" event={"ID":"fd3073fe-435c-4974-821b-9229018bf5f4","Type":"ContainerStarted","Data":"ca05f341d0ee24b3512f62b0271d2d292074aee2dea71e55e8e8b6e8a056e17f"}
Apr 22 15:59:06.775243 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:06.775227 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:59:06.790817 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:06.790792 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xxznf"
Apr 22 15:59:06.811317 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:06.811271 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" podStartSLOduration=10.252608406 podStartE2EDuration="27.811257633s" podCreationTimestamp="2026-04-22 15:58:39 +0000 UTC" firstStartedPulling="2026-04-22 15:58:42.223113392 +0000 UTC m=+3.122667441" lastFinishedPulling="2026-04-22 15:58:59.781762618 +0000 UTC m=+20.681316668" observedRunningTime="2026-04-22 15:59:06.810264558 +0000 UTC m=+27.709818628" watchObservedRunningTime="2026-04-22 15:59:06.811257633 +0000 UTC m=+27.710811681"
Apr 22 15:59:07.778752 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:07.778663 2572 generic.go:358] "Generic (PLEG): container finished" podID="a1b11795-9e34-41fd-9198-cc57fa3cfbf7" containerID="a73863ef05ad599997093ba87f1a1a0e5cc62fafdb63e6d6494a30a384c4eb13" exitCode=0
Apr 22 15:59:07.779213 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:07.778782 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4tvm4" event={"ID":"a1b11795-9e34-41fd-9198-cc57fa3cfbf7","Type":"ContainerDied","Data":"a73863ef05ad599997093ba87f1a1a0e5cc62fafdb63e6d6494a30a384c4eb13"}
Apr 22 15:59:08.640598 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:08.640562 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-76x4b"
Apr 22 15:59:08.640598 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:08.640591 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dqwgt"
Apr 22 15:59:08.640832 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:08.640669 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-76x4b" podUID="c090a1ee-5091-44d6-9e1b-65bf4dc8b1be" Apr 22 15:59:08.640832 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:08.640731 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:59:08.640933 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:08.640839 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dqwgt" podUID="462dda43-d18b-4f55-b5d0-d9b9cbbb2e60" Apr 22 15:59:08.641007 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:08.640966 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rs5cd" podUID="c2282f87-8976-4573-9b9c-d12f85477077" Apr 22 15:59:09.760691 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:09.760469 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-nbrr9" Apr 22 15:59:09.761181 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:09.760820 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 15:59:09.761181 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:09.761053 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-nbrr9" Apr 22 15:59:10.640950 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:10.640875 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dqwgt" Apr 22 15:59:10.640950 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:10.640899 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:59:10.641162 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:10.641012 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dqwgt" podUID="462dda43-d18b-4f55-b5d0-d9b9cbbb2e60" Apr 22 15:59:10.641162 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:10.641049 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-76x4b" Apr 22 15:59:10.641310 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:10.641241 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-76x4b" podUID="c090a1ee-5091-44d6-9e1b-65bf4dc8b1be" Apr 22 15:59:10.641345 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:10.641329 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rs5cd" podUID="c2282f87-8976-4573-9b9c-d12f85477077" Apr 22 15:59:12.640915 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:12.640873 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dqwgt" Apr 22 15:59:12.640915 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:12.640899 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:59:12.641553 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:12.640992 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-76x4b" Apr 22 15:59:12.641553 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:12.641011 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dqwgt" podUID="462dda43-d18b-4f55-b5d0-d9b9cbbb2e60" Apr 22 15:59:12.641553 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:12.641105 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-76x4b" podUID="c090a1ee-5091-44d6-9e1b-65bf4dc8b1be" Apr 22 15:59:12.641553 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:12.641211 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rs5cd" podUID="c2282f87-8976-4573-9b9c-d12f85477077" Apr 22 15:59:12.949718 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:12.949687 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-9.ec2.internal" event="NodeReady" Apr 22 15:59:12.949914 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:12.949839 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 15:59:12.988961 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:12.988931 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-nfknt"] Apr 22 15:59:12.994731 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:12.994702 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-pcv67"] Apr 22 15:59:12.994895 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:12.994876 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-nfknt" Apr 22 15:59:12.997819 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:12.997283 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 15:59:12.997819 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:12.997382 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 15:59:12.997819 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:12.997574 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-lm8hj\"" Apr 22 15:59:12.999048 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:12.998685 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pcv67" Apr 22 15:59:13.000562 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.000539 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 15:59:13.000893 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.000877 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 15:59:13.001042 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.001020 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 15:59:13.001171 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.001156 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sbbp8\"" Apr 22 15:59:13.001259 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.001175 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nfknt"] Apr 22 15:59:13.019413 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.018733 2572 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pcv67"] Apr 22 15:59:13.070683 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.070641 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xndvs\" (UniqueName: \"kubernetes.io/projected/11c29039-8c35-465e-8df3-408a688c08ed-kube-api-access-xndvs\") pod \"ingress-canary-pcv67\" (UID: \"11c29039-8c35-465e-8df3-408a688c08ed\") " pod="openshift-ingress-canary/ingress-canary-pcv67" Apr 22 15:59:13.070874 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.070693 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2e728dc-359a-4e6e-831e-f1b83f015c97-config-volume\") pod \"dns-default-nfknt\" (UID: \"f2e728dc-359a-4e6e-831e-f1b83f015c97\") " pod="openshift-dns/dns-default-nfknt" Apr 22 15:59:13.070874 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.070743 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f2e728dc-359a-4e6e-831e-f1b83f015c97-tmp-dir\") pod \"dns-default-nfknt\" (UID: \"f2e728dc-359a-4e6e-831e-f1b83f015c97\") " pod="openshift-dns/dns-default-nfknt" Apr 22 15:59:13.070874 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.070771 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2e728dc-359a-4e6e-831e-f1b83f015c97-metrics-tls\") pod \"dns-default-nfknt\" (UID: \"f2e728dc-359a-4e6e-831e-f1b83f015c97\") " pod="openshift-dns/dns-default-nfknt" Apr 22 15:59:13.070874 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.070813 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq456\" (UniqueName: 
\"kubernetes.io/projected/f2e728dc-359a-4e6e-831e-f1b83f015c97-kube-api-access-jq456\") pod \"dns-default-nfknt\" (UID: \"f2e728dc-359a-4e6e-831e-f1b83f015c97\") " pod="openshift-dns/dns-default-nfknt" Apr 22 15:59:13.070874 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.070859 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11c29039-8c35-465e-8df3-408a688c08ed-cert\") pod \"ingress-canary-pcv67\" (UID: \"11c29039-8c35-465e-8df3-408a688c08ed\") " pod="openshift-ingress-canary/ingress-canary-pcv67" Apr 22 15:59:13.171968 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.171929 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11c29039-8c35-465e-8df3-408a688c08ed-cert\") pod \"ingress-canary-pcv67\" (UID: \"11c29039-8c35-465e-8df3-408a688c08ed\") " pod="openshift-ingress-canary/ingress-canary-pcv67" Apr 22 15:59:13.172156 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.171986 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xndvs\" (UniqueName: \"kubernetes.io/projected/11c29039-8c35-465e-8df3-408a688c08ed-kube-api-access-xndvs\") pod \"ingress-canary-pcv67\" (UID: \"11c29039-8c35-465e-8df3-408a688c08ed\") " pod="openshift-ingress-canary/ingress-canary-pcv67" Apr 22 15:59:13.172156 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.172014 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2e728dc-359a-4e6e-831e-f1b83f015c97-config-volume\") pod \"dns-default-nfknt\" (UID: \"f2e728dc-359a-4e6e-831e-f1b83f015c97\") " pod="openshift-dns/dns-default-nfknt" Apr 22 15:59:13.172156 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.172050 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/f2e728dc-359a-4e6e-831e-f1b83f015c97-tmp-dir\") pod \"dns-default-nfknt\" (UID: \"f2e728dc-359a-4e6e-831e-f1b83f015c97\") " pod="openshift-dns/dns-default-nfknt" Apr 22 15:59:13.172156 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.172102 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2e728dc-359a-4e6e-831e-f1b83f015c97-metrics-tls\") pod \"dns-default-nfknt\" (UID: \"f2e728dc-359a-4e6e-831e-f1b83f015c97\") " pod="openshift-dns/dns-default-nfknt" Apr 22 15:59:13.172156 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:13.172106 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:59:13.172156 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.172146 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq456\" (UniqueName: \"kubernetes.io/projected/f2e728dc-359a-4e6e-831e-f1b83f015c97-kube-api-access-jq456\") pod \"dns-default-nfknt\" (UID: \"f2e728dc-359a-4e6e-831e-f1b83f015c97\") " pod="openshift-dns/dns-default-nfknt" Apr 22 15:59:13.172506 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:13.172181 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11c29039-8c35-465e-8df3-408a688c08ed-cert podName:11c29039-8c35-465e-8df3-408a688c08ed nodeName:}" failed. No retries permitted until 2026-04-22 15:59:13.672161924 +0000 UTC m=+34.571715977 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/11c29039-8c35-465e-8df3-408a688c08ed-cert") pod "ingress-canary-pcv67" (UID: "11c29039-8c35-465e-8df3-408a688c08ed") : secret "canary-serving-cert" not found Apr 22 15:59:13.172506 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:13.172377 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:59:13.172506 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:13.172428 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2e728dc-359a-4e6e-831e-f1b83f015c97-metrics-tls podName:f2e728dc-359a-4e6e-831e-f1b83f015c97 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:13.672411857 +0000 UTC m=+34.571965907 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f2e728dc-359a-4e6e-831e-f1b83f015c97-metrics-tls") pod "dns-default-nfknt" (UID: "f2e728dc-359a-4e6e-831e-f1b83f015c97") : secret "dns-default-metrics-tls" not found Apr 22 15:59:13.172506 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.172469 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f2e728dc-359a-4e6e-831e-f1b83f015c97-tmp-dir\") pod \"dns-default-nfknt\" (UID: \"f2e728dc-359a-4e6e-831e-f1b83f015c97\") " pod="openshift-dns/dns-default-nfknt" Apr 22 15:59:13.172739 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.172716 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2e728dc-359a-4e6e-831e-f1b83f015c97-config-volume\") pod \"dns-default-nfknt\" (UID: \"f2e728dc-359a-4e6e-831e-f1b83f015c97\") " pod="openshift-dns/dns-default-nfknt" Apr 22 15:59:13.184534 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.184506 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jq456\" (UniqueName: \"kubernetes.io/projected/f2e728dc-359a-4e6e-831e-f1b83f015c97-kube-api-access-jq456\") pod \"dns-default-nfknt\" (UID: \"f2e728dc-359a-4e6e-831e-f1b83f015c97\") " pod="openshift-dns/dns-default-nfknt" Apr 22 15:59:13.184704 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.184685 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xndvs\" (UniqueName: \"kubernetes.io/projected/11c29039-8c35-465e-8df3-408a688c08ed-kube-api-access-xndvs\") pod \"ingress-canary-pcv67\" (UID: \"11c29039-8c35-465e-8df3-408a688c08ed\") " pod="openshift-ingress-canary/ingress-canary-pcv67" Apr 22 15:59:13.272792 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.272707 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs\") pod \"network-metrics-daemon-76x4b\" (UID: \"c090a1ee-5091-44d6-9e1b-65bf4dc8b1be\") " pod="openshift-multus/network-metrics-daemon-76x4b" Apr 22 15:59:13.272970 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:13.272847 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:59:13.272970 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:13.272906 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs podName:c090a1ee-5091-44d6-9e1b-65bf4dc8b1be nodeName:}" failed. No retries permitted until 2026-04-22 15:59:45.272890386 +0000 UTC m=+66.172444436 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs") pod "network-metrics-daemon-76x4b" (UID: "c090a1ee-5091-44d6-9e1b-65bf4dc8b1be") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:59:13.373036 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.373001 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7m8g\" (UniqueName: \"kubernetes.io/projected/462dda43-d18b-4f55-b5d0-d9b9cbbb2e60-kube-api-access-x7m8g\") pod \"network-check-target-dqwgt\" (UID: \"462dda43-d18b-4f55-b5d0-d9b9cbbb2e60\") " pod="openshift-network-diagnostics/network-check-target-dqwgt" Apr 22 15:59:13.373248 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:13.373218 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:59:13.373248 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:13.373243 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:59:13.373371 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:13.373258 2572 projected.go:194] Error preparing data for projected volume kube-api-access-x7m8g for pod openshift-network-diagnostics/network-check-target-dqwgt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:59:13.373371 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:13.373328 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/462dda43-d18b-4f55-b5d0-d9b9cbbb2e60-kube-api-access-x7m8g podName:462dda43-d18b-4f55-b5d0-d9b9cbbb2e60 nodeName:}" failed. 
No retries permitted until 2026-04-22 15:59:45.373308121 +0000 UTC m=+66.272862173 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-x7m8g" (UniqueName: "kubernetes.io/projected/462dda43-d18b-4f55-b5d0-d9b9cbbb2e60-kube-api-access-x7m8g") pod "network-check-target-dqwgt" (UID: "462dda43-d18b-4f55-b5d0-d9b9cbbb2e60") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:59:13.676090 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.676054 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11c29039-8c35-465e-8df3-408a688c08ed-cert\") pod \"ingress-canary-pcv67\" (UID: \"11c29039-8c35-465e-8df3-408a688c08ed\") " pod="openshift-ingress-canary/ingress-canary-pcv67" Apr 22 15:59:13.676781 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:13.676120 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2e728dc-359a-4e6e-831e-f1b83f015c97-metrics-tls\") pod \"dns-default-nfknt\" (UID: \"f2e728dc-359a-4e6e-831e-f1b83f015c97\") " pod="openshift-dns/dns-default-nfknt" Apr 22 15:59:13.676781 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:13.676238 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:59:13.676781 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:13.676234 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:59:13.676781 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:13.676288 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2e728dc-359a-4e6e-831e-f1b83f015c97-metrics-tls podName:f2e728dc-359a-4e6e-831e-f1b83f015c97 nodeName:}" failed. 
No retries permitted until 2026-04-22 15:59:14.676275037 +0000 UTC m=+35.575829087 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f2e728dc-359a-4e6e-831e-f1b83f015c97-metrics-tls") pod "dns-default-nfknt" (UID: "f2e728dc-359a-4e6e-831e-f1b83f015c97") : secret "dns-default-metrics-tls" not found Apr 22 15:59:13.676781 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:13.676302 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11c29039-8c35-465e-8df3-408a688c08ed-cert podName:11c29039-8c35-465e-8df3-408a688c08ed nodeName:}" failed. No retries permitted until 2026-04-22 15:59:14.676296497 +0000 UTC m=+35.575850546 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/11c29039-8c35-465e-8df3-408a688c08ed-cert") pod "ingress-canary-pcv67" (UID: "11c29039-8c35-465e-8df3-408a688c08ed") : secret "canary-serving-cert" not found Apr 22 15:59:14.640549 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:14.640493 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:59:14.640765 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:14.640504 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dqwgt" Apr 22 15:59:14.640765 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:14.640504 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-76x4b" Apr 22 15:59:14.643383 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:14.643361 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 15:59:14.644351 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:14.644324 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 15:59:14.644351 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:14.644321 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-52jwv\"" Apr 22 15:59:14.644503 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:14.644365 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jgt28\"" Apr 22 15:59:14.644503 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:14.644376 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 15:59:14.644503 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:14.644325 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 15:59:14.682139 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:14.682104 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2e728dc-359a-4e6e-831e-f1b83f015c97-metrics-tls\") pod \"dns-default-nfknt\" (UID: \"f2e728dc-359a-4e6e-831e-f1b83f015c97\") " pod="openshift-dns/dns-default-nfknt" Apr 22 15:59:14.682514 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:14.682166 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11c29039-8c35-465e-8df3-408a688c08ed-cert\") pod 
\"ingress-canary-pcv67\" (UID: \"11c29039-8c35-465e-8df3-408a688c08ed\") " pod="openshift-ingress-canary/ingress-canary-pcv67" Apr 22 15:59:14.682514 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:14.682268 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:59:14.682514 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:14.682292 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:59:14.682514 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:14.682328 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2e728dc-359a-4e6e-831e-f1b83f015c97-metrics-tls podName:f2e728dc-359a-4e6e-831e-f1b83f015c97 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:16.682313239 +0000 UTC m=+37.581867292 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f2e728dc-359a-4e6e-831e-f1b83f015c97-metrics-tls") pod "dns-default-nfknt" (UID: "f2e728dc-359a-4e6e-831e-f1b83f015c97") : secret "dns-default-metrics-tls" not found Apr 22 15:59:14.682514 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:14.682368 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11c29039-8c35-465e-8df3-408a688c08ed-cert podName:11c29039-8c35-465e-8df3-408a688c08ed nodeName:}" failed. No retries permitted until 2026-04-22 15:59:16.682334715 +0000 UTC m=+37.581888785 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/11c29039-8c35-465e-8df3-408a688c08ed-cert") pod "ingress-canary-pcv67" (UID: "11c29039-8c35-465e-8df3-408a688c08ed") : secret "canary-serving-cert" not found Apr 22 15:59:14.795505 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:14.795471 2572 generic.go:358] "Generic (PLEG): container finished" podID="a1b11795-9e34-41fd-9198-cc57fa3cfbf7" containerID="201a6666263733825cb673a9a9248d17133da317deeaad5b82ccfb6ff1c2193b" exitCode=0 Apr 22 15:59:14.795675 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:14.795533 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4tvm4" event={"ID":"a1b11795-9e34-41fd-9198-cc57fa3cfbf7","Type":"ContainerDied","Data":"201a6666263733825cb673a9a9248d17133da317deeaad5b82ccfb6ff1c2193b"} Apr 22 15:59:15.387518 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:15.387490 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c2282f87-8976-4573-9b9c-d12f85477077-original-pull-secret\") pod \"global-pull-secret-syncer-rs5cd\" (UID: \"c2282f87-8976-4573-9b9c-d12f85477077\") " pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:59:15.389930 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:15.389908 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c2282f87-8976-4573-9b9c-d12f85477077-original-pull-secret\") pod \"global-pull-secret-syncer-rs5cd\" (UID: \"c2282f87-8976-4573-9b9c-d12f85477077\") " pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:59:15.550853 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:15.550815 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rs5cd" Apr 22 15:59:15.703941 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:15.703766 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rs5cd"] Apr 22 15:59:15.708125 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:59:15.708098 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2282f87_8976_4573_9b9c_d12f85477077.slice/crio-f89b35a484af10474470c84f2d99f206f4402d0986d139c4f025743df43ca83b WatchSource:0}: Error finding container f89b35a484af10474470c84f2d99f206f4402d0986d139c4f025743df43ca83b: Status 404 returned error can't find the container with id f89b35a484af10474470c84f2d99f206f4402d0986d139c4f025743df43ca83b Apr 22 15:59:15.798787 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:15.798753 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rs5cd" event={"ID":"c2282f87-8976-4573-9b9c-d12f85477077","Type":"ContainerStarted","Data":"f89b35a484af10474470c84f2d99f206f4402d0986d139c4f025743df43ca83b"} Apr 22 15:59:15.801328 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:15.801301 2572 generic.go:358] "Generic (PLEG): container finished" podID="a1b11795-9e34-41fd-9198-cc57fa3cfbf7" containerID="2a2e89b1e7188a8d8cb460ee39ddc9b74bf88de17a5092d1a18059b1e9fc3947" exitCode=0 Apr 22 15:59:15.801462 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:15.801355 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4tvm4" event={"ID":"a1b11795-9e34-41fd-9198-cc57fa3cfbf7","Type":"ContainerDied","Data":"2a2e89b1e7188a8d8cb460ee39ddc9b74bf88de17a5092d1a18059b1e9fc3947"} Apr 22 15:59:16.697809 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:16.697753 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/f2e728dc-359a-4e6e-831e-f1b83f015c97-metrics-tls\") pod \"dns-default-nfknt\" (UID: \"f2e728dc-359a-4e6e-831e-f1b83f015c97\") " pod="openshift-dns/dns-default-nfknt" Apr 22 15:59:16.698008 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:16.697841 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11c29039-8c35-465e-8df3-408a688c08ed-cert\") pod \"ingress-canary-pcv67\" (UID: \"11c29039-8c35-465e-8df3-408a688c08ed\") " pod="openshift-ingress-canary/ingress-canary-pcv67" Apr 22 15:59:16.698008 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:16.697927 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:59:16.698008 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:16.697961 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:59:16.698008 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:16.697994 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2e728dc-359a-4e6e-831e-f1b83f015c97-metrics-tls podName:f2e728dc-359a-4e6e-831e-f1b83f015c97 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:20.697978438 +0000 UTC m=+41.597532510 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f2e728dc-359a-4e6e-831e-f1b83f015c97-metrics-tls") pod "dns-default-nfknt" (UID: "f2e728dc-359a-4e6e-831e-f1b83f015c97") : secret "dns-default-metrics-tls" not found Apr 22 15:59:16.698008 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:16.698010 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11c29039-8c35-465e-8df3-408a688c08ed-cert podName:11c29039-8c35-465e-8df3-408a688c08ed nodeName:}" failed. 
No retries permitted until 2026-04-22 15:59:20.698004266 +0000 UTC m=+41.597558315 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/11c29039-8c35-465e-8df3-408a688c08ed-cert") pod "ingress-canary-pcv67" (UID: "11c29039-8c35-465e-8df3-408a688c08ed") : secret "canary-serving-cert" not found Apr 22 15:59:16.806665 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:16.806633 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4tvm4" event={"ID":"a1b11795-9e34-41fd-9198-cc57fa3cfbf7","Type":"ContainerStarted","Data":"6e69de32dc441614316fc86cb93023bcfe5c29b1b578f023f4369604171619d7"} Apr 22 15:59:16.828342 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:16.828278 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4tvm4" podStartSLOduration=6.260516569 podStartE2EDuration="37.828258968s" podCreationTimestamp="2026-04-22 15:58:39 +0000 UTC" firstStartedPulling="2026-04-22 15:58:42.216396175 +0000 UTC m=+3.115950230" lastFinishedPulling="2026-04-22 15:59:13.784138577 +0000 UTC m=+34.683692629" observedRunningTime="2026-04-22 15:59:16.8263801 +0000 UTC m=+37.725934171" watchObservedRunningTime="2026-04-22 15:59:16.828258968 +0000 UTC m=+37.727813043" Apr 22 15:59:19.813989 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:19.813742 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rs5cd" event={"ID":"c2282f87-8976-4573-9b9c-d12f85477077","Type":"ContainerStarted","Data":"c2f08f77fbc2652f1b96148803434721e5bcb6c2bd981914c156ace03fa6c610"} Apr 22 15:59:20.729057 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:20.729021 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11c29039-8c35-465e-8df3-408a688c08ed-cert\") pod \"ingress-canary-pcv67\" (UID: 
\"11c29039-8c35-465e-8df3-408a688c08ed\") " pod="openshift-ingress-canary/ingress-canary-pcv67" Apr 22 15:59:20.729278 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:20.729089 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2e728dc-359a-4e6e-831e-f1b83f015c97-metrics-tls\") pod \"dns-default-nfknt\" (UID: \"f2e728dc-359a-4e6e-831e-f1b83f015c97\") " pod="openshift-dns/dns-default-nfknt" Apr 22 15:59:20.729278 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:20.729188 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:59:20.729278 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:20.729212 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:59:20.729278 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:20.729272 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11c29039-8c35-465e-8df3-408a688c08ed-cert podName:11c29039-8c35-465e-8df3-408a688c08ed nodeName:}" failed. No retries permitted until 2026-04-22 15:59:28.729255621 +0000 UTC m=+49.628809670 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/11c29039-8c35-465e-8df3-408a688c08ed-cert") pod "ingress-canary-pcv67" (UID: "11c29039-8c35-465e-8df3-408a688c08ed") : secret "canary-serving-cert" not found Apr 22 15:59:20.729424 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:20.729288 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2e728dc-359a-4e6e-831e-f1b83f015c97-metrics-tls podName:f2e728dc-359a-4e6e-831e-f1b83f015c97 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:28.72928218 +0000 UTC m=+49.628836229 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f2e728dc-359a-4e6e-831e-f1b83f015c97-metrics-tls") pod "dns-default-nfknt" (UID: "f2e728dc-359a-4e6e-831e-f1b83f015c97") : secret "dns-default-metrics-tls" not found Apr 22 15:59:28.789800 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:28.789757 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11c29039-8c35-465e-8df3-408a688c08ed-cert\") pod \"ingress-canary-pcv67\" (UID: \"11c29039-8c35-465e-8df3-408a688c08ed\") " pod="openshift-ingress-canary/ingress-canary-pcv67" Apr 22 15:59:28.790420 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:28.789857 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2e728dc-359a-4e6e-831e-f1b83f015c97-metrics-tls\") pod \"dns-default-nfknt\" (UID: \"f2e728dc-359a-4e6e-831e-f1b83f015c97\") " pod="openshift-dns/dns-default-nfknt" Apr 22 15:59:28.790420 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:28.789918 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:59:28.790420 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:28.789970 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:59:28.790420 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:28.789995 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11c29039-8c35-465e-8df3-408a688c08ed-cert podName:11c29039-8c35-465e-8df3-408a688c08ed nodeName:}" failed. No retries permitted until 2026-04-22 15:59:44.789977774 +0000 UTC m=+65.689531822 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/11c29039-8c35-465e-8df3-408a688c08ed-cert") pod "ingress-canary-pcv67" (UID: "11c29039-8c35-465e-8df3-408a688c08ed") : secret "canary-serving-cert" not found Apr 22 15:59:28.790420 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:28.790031 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2e728dc-359a-4e6e-831e-f1b83f015c97-metrics-tls podName:f2e728dc-359a-4e6e-831e-f1b83f015c97 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:44.790014438 +0000 UTC m=+65.689568488 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f2e728dc-359a-4e6e-831e-f1b83f015c97-metrics-tls") pod "dns-default-nfknt" (UID: "f2e728dc-359a-4e6e-831e-f1b83f015c97") : secret "dns-default-metrics-tls" not found Apr 22 15:59:38.791439 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:38.791402 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xxznf" Apr 22 15:59:38.816731 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:38.816675 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-rs5cd" podStartSLOduration=51.974524356 podStartE2EDuration="55.816660256s" podCreationTimestamp="2026-04-22 15:58:43 +0000 UTC" firstStartedPulling="2026-04-22 15:59:15.709799594 +0000 UTC m=+36.609353646" lastFinishedPulling="2026-04-22 15:59:19.55193548 +0000 UTC m=+40.451489546" observedRunningTime="2026-04-22 15:59:19.827326001 +0000 UTC m=+40.726880073" watchObservedRunningTime="2026-04-22 15:59:38.816660256 +0000 UTC m=+59.716214330" Apr 22 15:59:44.799517 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:44.799475 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2e728dc-359a-4e6e-831e-f1b83f015c97-metrics-tls\") 
pod \"dns-default-nfknt\" (UID: \"f2e728dc-359a-4e6e-831e-f1b83f015c97\") " pod="openshift-dns/dns-default-nfknt" Apr 22 15:59:44.799517 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:44.799526 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11c29039-8c35-465e-8df3-408a688c08ed-cert\") pod \"ingress-canary-pcv67\" (UID: \"11c29039-8c35-465e-8df3-408a688c08ed\") " pod="openshift-ingress-canary/ingress-canary-pcv67" Apr 22 15:59:44.799928 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:44.799610 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:59:44.799928 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:44.799615 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:59:44.799928 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:44.799663 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11c29039-8c35-465e-8df3-408a688c08ed-cert podName:11c29039-8c35-465e-8df3-408a688c08ed nodeName:}" failed. No retries permitted until 2026-04-22 16:00:16.799649615 +0000 UTC m=+97.699203665 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/11c29039-8c35-465e-8df3-408a688c08ed-cert") pod "ingress-canary-pcv67" (UID: "11c29039-8c35-465e-8df3-408a688c08ed") : secret "canary-serving-cert" not found Apr 22 15:59:44.799928 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:44.799675 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2e728dc-359a-4e6e-831e-f1b83f015c97-metrics-tls podName:f2e728dc-359a-4e6e-831e-f1b83f015c97 nodeName:}" failed. No retries permitted until 2026-04-22 16:00:16.799669592 +0000 UTC m=+97.699223640 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f2e728dc-359a-4e6e-831e-f1b83f015c97-metrics-tls") pod "dns-default-nfknt" (UID: "f2e728dc-359a-4e6e-831e-f1b83f015c97") : secret "dns-default-metrics-tls" not found Apr 22 15:59:45.302550 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:45.302512 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs\") pod \"network-metrics-daemon-76x4b\" (UID: \"c090a1ee-5091-44d6-9e1b-65bf4dc8b1be\") " pod="openshift-multus/network-metrics-daemon-76x4b" Apr 22 15:59:45.305047 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:45.305025 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 15:59:45.312726 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:45.312706 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 15:59:45.312775 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:45.312765 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs podName:c090a1ee-5091-44d6-9e1b-65bf4dc8b1be nodeName:}" failed. No retries permitted until 2026-04-22 16:00:49.312748841 +0000 UTC m=+130.212302889 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs") pod "network-metrics-daemon-76x4b" (UID: "c090a1ee-5091-44d6-9e1b-65bf4dc8b1be") : secret "metrics-daemon-secret" not found Apr 22 15:59:45.402890 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:45.402844 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7m8g\" (UniqueName: \"kubernetes.io/projected/462dda43-d18b-4f55-b5d0-d9b9cbbb2e60-kube-api-access-x7m8g\") pod \"network-check-target-dqwgt\" (UID: \"462dda43-d18b-4f55-b5d0-d9b9cbbb2e60\") " pod="openshift-network-diagnostics/network-check-target-dqwgt" Apr 22 15:59:45.405408 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:45.405391 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 15:59:45.415782 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:45.415762 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 15:59:45.427688 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:45.427656 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7m8g\" (UniqueName: \"kubernetes.io/projected/462dda43-d18b-4f55-b5d0-d9b9cbbb2e60-kube-api-access-x7m8g\") pod \"network-check-target-dqwgt\" (UID: \"462dda43-d18b-4f55-b5d0-d9b9cbbb2e60\") " pod="openshift-network-diagnostics/network-check-target-dqwgt" Apr 22 15:59:45.558060 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:45.557989 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jgt28\"" Apr 22 15:59:45.565921 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:45.565895 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dqwgt" Apr 22 15:59:45.689577 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:45.689545 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dqwgt"] Apr 22 15:59:45.693610 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:59:45.693579 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod462dda43_d18b_4f55_b5d0_d9b9cbbb2e60.slice/crio-08770e3313c573520dd4d48454c8bc4157cf471c8f78105653e93a2b66e2687a WatchSource:0}: Error finding container 08770e3313c573520dd4d48454c8bc4157cf471c8f78105653e93a2b66e2687a: Status 404 returned error can't find the container with id 08770e3313c573520dd4d48454c8bc4157cf471c8f78105653e93a2b66e2687a Apr 22 15:59:45.866885 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:45.866854 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dqwgt" event={"ID":"462dda43-d18b-4f55-b5d0-d9b9cbbb2e60","Type":"ContainerStarted","Data":"08770e3313c573520dd4d48454c8bc4157cf471c8f78105653e93a2b66e2687a"} Apr 22 15:59:46.584135 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.584089 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-jbznj"] Apr 22 15:59:46.586389 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.586363 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-jbznj" Apr 22 15:59:46.588537 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.588508 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 15:59:46.588685 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.588569 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-nxgcq\"" Apr 22 15:59:46.588685 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.588573 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 15:59:46.589419 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.589388 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 15:59:46.589419 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.589403 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 15:59:46.594935 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.594911 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 22 15:59:46.604984 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.604950 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-jbznj"] Apr 22 15:59:46.610623 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.610593 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/14ff43e0-e359-4557-9f79-d5452a8479a0-snapshots\") pod \"insights-operator-585dfdc468-jbznj\" (UID: \"14ff43e0-e359-4557-9f79-d5452a8479a0\") " pod="openshift-insights/insights-operator-585dfdc468-jbznj" Apr 22 
15:59:46.610790 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.610645 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14ff43e0-e359-4557-9f79-d5452a8479a0-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-jbznj\" (UID: \"14ff43e0-e359-4557-9f79-d5452a8479a0\") " pod="openshift-insights/insights-operator-585dfdc468-jbznj" Apr 22 15:59:46.612224 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.611018 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/14ff43e0-e359-4557-9f79-d5452a8479a0-tmp\") pod \"insights-operator-585dfdc468-jbznj\" (UID: \"14ff43e0-e359-4557-9f79-d5452a8479a0\") " pod="openshift-insights/insights-operator-585dfdc468-jbznj" Apr 22 15:59:46.612224 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.611087 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14ff43e0-e359-4557-9f79-d5452a8479a0-service-ca-bundle\") pod \"insights-operator-585dfdc468-jbznj\" (UID: \"14ff43e0-e359-4557-9f79-d5452a8479a0\") " pod="openshift-insights/insights-operator-585dfdc468-jbznj" Apr 22 15:59:46.711647 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.711595 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/14ff43e0-e359-4557-9f79-d5452a8479a0-tmp\") pod \"insights-operator-585dfdc468-jbznj\" (UID: \"14ff43e0-e359-4557-9f79-d5452a8479a0\") " pod="openshift-insights/insights-operator-585dfdc468-jbznj" Apr 22 15:59:46.711837 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.711654 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/14ff43e0-e359-4557-9f79-d5452a8479a0-service-ca-bundle\") pod \"insights-operator-585dfdc468-jbznj\" (UID: \"14ff43e0-e359-4557-9f79-d5452a8479a0\") " pod="openshift-insights/insights-operator-585dfdc468-jbznj" Apr 22 15:59:46.711837 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.711718 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/14ff43e0-e359-4557-9f79-d5452a8479a0-snapshots\") pod \"insights-operator-585dfdc468-jbznj\" (UID: \"14ff43e0-e359-4557-9f79-d5452a8479a0\") " pod="openshift-insights/insights-operator-585dfdc468-jbznj" Apr 22 15:59:46.711837 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.711755 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45qnn\" (UniqueName: \"kubernetes.io/projected/14ff43e0-e359-4557-9f79-d5452a8479a0-kube-api-access-45qnn\") pod \"insights-operator-585dfdc468-jbznj\" (UID: \"14ff43e0-e359-4557-9f79-d5452a8479a0\") " pod="openshift-insights/insights-operator-585dfdc468-jbznj" Apr 22 15:59:46.711837 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.711797 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14ff43e0-e359-4557-9f79-d5452a8479a0-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-jbznj\" (UID: \"14ff43e0-e359-4557-9f79-d5452a8479a0\") " pod="openshift-insights/insights-operator-585dfdc468-jbznj" Apr 22 15:59:46.712053 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.711852 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14ff43e0-e359-4557-9f79-d5452a8479a0-serving-cert\") pod \"insights-operator-585dfdc468-jbznj\" (UID: \"14ff43e0-e359-4557-9f79-d5452a8479a0\") " pod="openshift-insights/insights-operator-585dfdc468-jbznj" Apr 22 
15:59:46.712106 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.712073 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/14ff43e0-e359-4557-9f79-d5452a8479a0-tmp\") pod \"insights-operator-585dfdc468-jbznj\" (UID: \"14ff43e0-e359-4557-9f79-d5452a8479a0\") " pod="openshift-insights/insights-operator-585dfdc468-jbznj" Apr 22 15:59:46.712433 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.712403 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14ff43e0-e359-4557-9f79-d5452a8479a0-service-ca-bundle\") pod \"insights-operator-585dfdc468-jbznj\" (UID: \"14ff43e0-e359-4557-9f79-d5452a8479a0\") " pod="openshift-insights/insights-operator-585dfdc468-jbznj" Apr 22 15:59:46.712701 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.712678 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14ff43e0-e359-4557-9f79-d5452a8479a0-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-jbznj\" (UID: \"14ff43e0-e359-4557-9f79-d5452a8479a0\") " pod="openshift-insights/insights-operator-585dfdc468-jbznj" Apr 22 15:59:46.712963 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.712942 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/14ff43e0-e359-4557-9f79-d5452a8479a0-snapshots\") pod \"insights-operator-585dfdc468-jbznj\" (UID: \"14ff43e0-e359-4557-9f79-d5452a8479a0\") " pod="openshift-insights/insights-operator-585dfdc468-jbznj" Apr 22 15:59:46.812776 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.812740 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-45qnn\" (UniqueName: \"kubernetes.io/projected/14ff43e0-e359-4557-9f79-d5452a8479a0-kube-api-access-45qnn\") pod \"insights-operator-585dfdc468-jbznj\" (UID: 
\"14ff43e0-e359-4557-9f79-d5452a8479a0\") " pod="openshift-insights/insights-operator-585dfdc468-jbznj" Apr 22 15:59:46.812966 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.812833 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14ff43e0-e359-4557-9f79-d5452a8479a0-serving-cert\") pod \"insights-operator-585dfdc468-jbznj\" (UID: \"14ff43e0-e359-4557-9f79-d5452a8479a0\") " pod="openshift-insights/insights-operator-585dfdc468-jbznj" Apr 22 15:59:46.815448 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.815421 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14ff43e0-e359-4557-9f79-d5452a8479a0-serving-cert\") pod \"insights-operator-585dfdc468-jbznj\" (UID: \"14ff43e0-e359-4557-9f79-d5452a8479a0\") " pod="openshift-insights/insights-operator-585dfdc468-jbznj" Apr 22 15:59:46.821552 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.821518 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-45qnn\" (UniqueName: \"kubernetes.io/projected/14ff43e0-e359-4557-9f79-d5452a8479a0-kube-api-access-45qnn\") pod \"insights-operator-585dfdc468-jbznj\" (UID: \"14ff43e0-e359-4557-9f79-d5452a8479a0\") " pod="openshift-insights/insights-operator-585dfdc468-jbznj" Apr 22 15:59:46.898044 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:46.897954 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-jbznj" Apr 22 15:59:47.034304 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:47.034269 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-jbznj"] Apr 22 15:59:47.037506 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:59:47.037480 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14ff43e0_e359_4557_9f79_d5452a8479a0.slice/crio-77a4de484c3da23ebd4fe677c9c80d1e3e60580c1e218828ea17f773d2033dbc WatchSource:0}: Error finding container 77a4de484c3da23ebd4fe677c9c80d1e3e60580c1e218828ea17f773d2033dbc: Status 404 returned error can't find the container with id 77a4de484c3da23ebd4fe677c9c80d1e3e60580c1e218828ea17f773d2033dbc Apr 22 15:59:47.872249 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:47.872181 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-jbznj" event={"ID":"14ff43e0-e359-4557-9f79-d5452a8479a0","Type":"ContainerStarted","Data":"77a4de484c3da23ebd4fe677c9c80d1e3e60580c1e218828ea17f773d2033dbc"} Apr 22 15:59:49.877363 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:49.877323 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dqwgt" event={"ID":"462dda43-d18b-4f55-b5d0-d9b9cbbb2e60","Type":"ContainerStarted","Data":"31b79d90e94c85f672df53d56d38385100a4292a09de384dc6d4da47b7df9eaf"} Apr 22 15:59:49.877829 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:49.877489 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-dqwgt" Apr 22 15:59:49.892434 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:49.892384 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-dqwgt" 
podStartSLOduration=67.753991334 podStartE2EDuration="1m10.892370378s" podCreationTimestamp="2026-04-22 15:58:39 +0000 UTC" firstStartedPulling="2026-04-22 15:59:45.696131367 +0000 UTC m=+66.595685416" lastFinishedPulling="2026-04-22 15:59:48.834510408 +0000 UTC m=+69.734064460" observedRunningTime="2026-04-22 15:59:49.891413582 +0000 UTC m=+70.790967655" watchObservedRunningTime="2026-04-22 15:59:49.892370378 +0000 UTC m=+70.791924448" Apr 22 15:59:50.518291 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:50.518246 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xsb7b"] Apr 22 15:59:50.548021 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:50.547987 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xsb7b"] Apr 22 15:59:50.548233 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:50.548119 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xsb7b" Apr 22 15:59:50.550871 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:50.550843 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-nvw9b\"" Apr 22 15:59:50.551002 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:50.550870 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 15:59:50.551798 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:50.551782 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 15:59:50.636055 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:50.636025 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74b88\" (UniqueName: \"kubernetes.io/projected/c859ac10-b350-418d-b543-fde1e18ef074-kube-api-access-74b88\") pod \"volume-data-source-validator-7c6cbb6c87-xsb7b\" (UID: \"c859ac10-b350-418d-b543-fde1e18ef074\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xsb7b" Apr 22 15:59:50.736339 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:50.736307 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74b88\" (UniqueName: \"kubernetes.io/projected/c859ac10-b350-418d-b543-fde1e18ef074-kube-api-access-74b88\") pod \"volume-data-source-validator-7c6cbb6c87-xsb7b\" (UID: \"c859ac10-b350-418d-b543-fde1e18ef074\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xsb7b" Apr 22 15:59:50.743834 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:50.743813 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74b88\" (UniqueName: 
\"kubernetes.io/projected/c859ac10-b350-418d-b543-fde1e18ef074-kube-api-access-74b88\") pod \"volume-data-source-validator-7c6cbb6c87-xsb7b\" (UID: \"c859ac10-b350-418d-b543-fde1e18ef074\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xsb7b" Apr 22 15:59:50.857162 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:50.857121 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xsb7b" Apr 22 15:59:51.056026 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:51.055990 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xsb7b"] Apr 22 15:59:51.059448 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:59:51.059422 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc859ac10_b350_418d_b543_fde1e18ef074.slice/crio-db3cf32dfb96c3c9438f7e6b00a679dc855ee543ea764dd1613fa07a1f5e0fe9 WatchSource:0}: Error finding container db3cf32dfb96c3c9438f7e6b00a679dc855ee543ea764dd1613fa07a1f5e0fe9: Status 404 returned error can't find the container with id db3cf32dfb96c3c9438f7e6b00a679dc855ee543ea764dd1613fa07a1f5e0fe9 Apr 22 15:59:51.883299 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:51.883257 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-jbznj" event={"ID":"14ff43e0-e359-4557-9f79-d5452a8479a0","Type":"ContainerStarted","Data":"4eff5c411b6e1b25f382feee3ed7a49ce01379fcd3b9cf4c4a4a4b194d7b3774"} Apr 22 15:59:51.884415 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:51.884384 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xsb7b" 
event={"ID":"c859ac10-b350-418d-b543-fde1e18ef074","Type":"ContainerStarted","Data":"db3cf32dfb96c3c9438f7e6b00a679dc855ee543ea764dd1613fa07a1f5e0fe9"} Apr 22 15:59:51.898877 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:51.898822 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-jbznj" podStartSLOduration=1.995469993 podStartE2EDuration="5.898806306s" podCreationTimestamp="2026-04-22 15:59:46 +0000 UTC" firstStartedPulling="2026-04-22 15:59:47.039594928 +0000 UTC m=+67.939148988" lastFinishedPulling="2026-04-22 15:59:50.942931238 +0000 UTC m=+71.842485301" observedRunningTime="2026-04-22 15:59:51.898352072 +0000 UTC m=+72.797906143" watchObservedRunningTime="2026-04-22 15:59:51.898806306 +0000 UTC m=+72.798360416" Apr 22 15:59:52.887966 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:52.887920 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xsb7b" event={"ID":"c859ac10-b350-418d-b543-fde1e18ef074","Type":"ContainerStarted","Data":"9904c728127adb3ff2bb50549a8800ac5147ab20e828f7a85178d9ec5ba450c8"} Apr 22 15:59:52.901680 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:52.901626 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xsb7b" podStartSLOduration=1.287292503 podStartE2EDuration="2.901612366s" podCreationTimestamp="2026-04-22 15:59:50 +0000 UTC" firstStartedPulling="2026-04-22 15:59:51.061302854 +0000 UTC m=+71.960856905" lastFinishedPulling="2026-04-22 15:59:52.675622719 +0000 UTC m=+73.575176768" observedRunningTime="2026-04-22 15:59:52.901238127 +0000 UTC m=+73.800792197" watchObservedRunningTime="2026-04-22 15:59:52.901612366 +0000 UTC m=+73.801166414" Apr 22 15:59:53.816998 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:53.816967 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-4sqfs_a60066e5-252b-4865-879a-0d0d3a6618d4/dns-node-resolver/0.log" Apr 22 15:59:55.016871 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:55.016840 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9d6jl_5756e223-5da3-420b-a640-5e3cdce35004/node-ca/0.log" Apr 22 15:59:55.537882 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:55.537850 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j58f5"] Apr 22 15:59:55.540672 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:55.540656 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j58f5" Apr 22 15:59:55.542839 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:55.542817 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 15:59:55.542977 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:55.542901 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 15:59:55.542977 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:55.542942 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 15:59:55.543696 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:55.543683 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 15:59:55.543751 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:55.543711 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-kb528\"" Apr 22 15:59:55.549504 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:55.549478 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j58f5"] Apr 22 15:59:55.562347 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:55.562321 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d7ddd84-35ed-400b-ad69-647f50964d8c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-j58f5\" (UID: \"8d7ddd84-35ed-400b-ad69-647f50964d8c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j58f5" Apr 22 15:59:55.562464 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:55.562400 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d7ddd84-35ed-400b-ad69-647f50964d8c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-j58f5\" (UID: \"8d7ddd84-35ed-400b-ad69-647f50964d8c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j58f5" Apr 22 15:59:55.562464 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:55.562431 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kr2n\" (UniqueName: \"kubernetes.io/projected/8d7ddd84-35ed-400b-ad69-647f50964d8c-kube-api-access-8kr2n\") pod \"kube-storage-version-migrator-operator-6769c5d45-j58f5\" (UID: \"8d7ddd84-35ed-400b-ad69-647f50964d8c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j58f5" Apr 22 15:59:55.662745 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:55.662711 
2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d7ddd84-35ed-400b-ad69-647f50964d8c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-j58f5\" (UID: \"8d7ddd84-35ed-400b-ad69-647f50964d8c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j58f5" Apr 22 15:59:55.662904 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:55.662755 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kr2n\" (UniqueName: \"kubernetes.io/projected/8d7ddd84-35ed-400b-ad69-647f50964d8c-kube-api-access-8kr2n\") pod \"kube-storage-version-migrator-operator-6769c5d45-j58f5\" (UID: \"8d7ddd84-35ed-400b-ad69-647f50964d8c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j58f5" Apr 22 15:59:55.662904 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:55.662787 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d7ddd84-35ed-400b-ad69-647f50964d8c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-j58f5\" (UID: \"8d7ddd84-35ed-400b-ad69-647f50964d8c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j58f5" Apr 22 15:59:55.663233 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:55.663179 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d7ddd84-35ed-400b-ad69-647f50964d8c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-j58f5\" (UID: \"8d7ddd84-35ed-400b-ad69-647f50964d8c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j58f5" Apr 22 15:59:55.665097 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:55.665078 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d7ddd84-35ed-400b-ad69-647f50964d8c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-j58f5\" (UID: \"8d7ddd84-35ed-400b-ad69-647f50964d8c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j58f5" Apr 22 15:59:55.670381 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:55.670353 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kr2n\" (UniqueName: \"kubernetes.io/projected/8d7ddd84-35ed-400b-ad69-647f50964d8c-kube-api-access-8kr2n\") pod \"kube-storage-version-migrator-operator-6769c5d45-j58f5\" (UID: \"8d7ddd84-35ed-400b-ad69-647f50964d8c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j58f5" Apr 22 15:59:55.850244 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:55.850170 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j58f5" Apr 22 15:59:55.965697 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:55.965666 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j58f5"] Apr 22 15:59:55.969266 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:59:55.969185 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d7ddd84_35ed_400b_ad69_647f50964d8c.slice/crio-481d0c89e5727b20e16e1861017fdf1ea81a419113ee9b0cc45bf4b616e7318e WatchSource:0}: Error finding container 481d0c89e5727b20e16e1861017fdf1ea81a419113ee9b0cc45bf4b616e7318e: Status 404 returned error can't find the container with id 481d0c89e5727b20e16e1861017fdf1ea81a419113ee9b0cc45bf4b616e7318e Apr 22 15:59:56.558610 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:56.558577 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rm52c"] Apr 22 15:59:56.562675 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:56.562656 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rm52c" Apr 22 15:59:56.564788 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:56.564765 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 22 15:59:56.564898 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:56.564817 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 22 15:59:56.564898 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:56.564822 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-p6d79\"" Apr 22 15:59:56.565979 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:56.565964 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 22 15:59:56.566065 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:56.565999 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 22 15:59:56.567956 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:56.567929 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59310538-efb1-4059-a9c2-3dac6061df75-serving-cert\") pod \"service-ca-operator-d6fc45fc5-rm52c\" (UID: \"59310538-efb1-4059-a9c2-3dac6061df75\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rm52c" Apr 22 15:59:56.568067 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:56.567958 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59310538-efb1-4059-a9c2-3dac6061df75-config\") pod \"service-ca-operator-d6fc45fc5-rm52c\" (UID: 
\"59310538-efb1-4059-a9c2-3dac6061df75\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rm52c" Apr 22 15:59:56.568067 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:56.568052 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4npx\" (UniqueName: \"kubernetes.io/projected/59310538-efb1-4059-a9c2-3dac6061df75-kube-api-access-j4npx\") pod \"service-ca-operator-d6fc45fc5-rm52c\" (UID: \"59310538-efb1-4059-a9c2-3dac6061df75\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rm52c" Apr 22 15:59:56.570705 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:56.570685 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rm52c"] Apr 22 15:59:56.668641 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:56.668612 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59310538-efb1-4059-a9c2-3dac6061df75-serving-cert\") pod \"service-ca-operator-d6fc45fc5-rm52c\" (UID: \"59310538-efb1-4059-a9c2-3dac6061df75\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rm52c" Apr 22 15:59:56.668641 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:56.668646 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59310538-efb1-4059-a9c2-3dac6061df75-config\") pod \"service-ca-operator-d6fc45fc5-rm52c\" (UID: \"59310538-efb1-4059-a9c2-3dac6061df75\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rm52c" Apr 22 15:59:56.670223 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:56.668941 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4npx\" (UniqueName: \"kubernetes.io/projected/59310538-efb1-4059-a9c2-3dac6061df75-kube-api-access-j4npx\") pod \"service-ca-operator-d6fc45fc5-rm52c\" (UID: 
\"59310538-efb1-4059-a9c2-3dac6061df75\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rm52c" Apr 22 15:59:56.670223 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:56.669676 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59310538-efb1-4059-a9c2-3dac6061df75-config\") pod \"service-ca-operator-d6fc45fc5-rm52c\" (UID: \"59310538-efb1-4059-a9c2-3dac6061df75\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rm52c" Apr 22 15:59:56.671627 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:56.671603 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59310538-efb1-4059-a9c2-3dac6061df75-serving-cert\") pod \"service-ca-operator-d6fc45fc5-rm52c\" (UID: \"59310538-efb1-4059-a9c2-3dac6061df75\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rm52c" Apr 22 15:59:56.677126 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:56.677104 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4npx\" (UniqueName: \"kubernetes.io/projected/59310538-efb1-4059-a9c2-3dac6061df75-kube-api-access-j4npx\") pod \"service-ca-operator-d6fc45fc5-rm52c\" (UID: \"59310538-efb1-4059-a9c2-3dac6061df75\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rm52c" Apr 22 15:59:56.872110 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:56.872067 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rm52c" Apr 22 15:59:56.898940 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:56.898905 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j58f5" event={"ID":"8d7ddd84-35ed-400b-ad69-647f50964d8c","Type":"ContainerStarted","Data":"481d0c89e5727b20e16e1861017fdf1ea81a419113ee9b0cc45bf4b616e7318e"} Apr 22 15:59:57.005632 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.005596 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rm52c"] Apr 22 15:59:57.008647 ip-10-0-135-9 kubenswrapper[2572]: W0422 15:59:57.008616 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59310538_efb1_4059_a9c2_3dac6061df75.slice/crio-2991960ebb2d5c8f33720575421c1f2838c83e4467d144ec61891b477eb7f07d WatchSource:0}: Error finding container 2991960ebb2d5c8f33720575421c1f2838c83e4467d144ec61891b477eb7f07d: Status 404 returned error can't find the container with id 2991960ebb2d5c8f33720575421c1f2838c83e4467d144ec61891b477eb7f07d Apr 22 15:59:57.127475 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.127394 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-65cfbcb6dc-lbj9q"] Apr 22 15:59:57.131781 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.131754 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.134335 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.134302 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-4mqmh\"" Apr 22 15:59:57.134335 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.134326 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 15:59:57.134528 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.134326 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 15:59:57.134528 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.134470 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 15:59:57.140531 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.139743 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 15:59:57.140531 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.140486 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-65cfbcb6dc-lbj9q"] Apr 22 15:59:57.173348 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.173312 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bce0e47e-7873-4e15-8bf5-defeba101e19-image-registry-private-configuration\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.173348 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.173348 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bce0e47e-7873-4e15-8bf5-defeba101e19-trusted-ca\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.173576 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.173392 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-tls\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.173576 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.173457 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-certificates\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.173576 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.173482 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-bound-sa-token\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.173576 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.173498 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bce0e47e-7873-4e15-8bf5-defeba101e19-ca-trust-extracted\") pod 
\"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.173576 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.173516 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bce0e47e-7873-4e15-8bf5-defeba101e19-installation-pull-secrets\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.173824 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.173634 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq4p2\" (UniqueName: \"kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-kube-api-access-dq4p2\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.273969 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.273934 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-tls\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.274118 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.273988 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-certificates\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.274118 ip-10-0-135-9 kubenswrapper[2572]: 
I0422 15:59:57.274011 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-bound-sa-token\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.274118 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.274029 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bce0e47e-7873-4e15-8bf5-defeba101e19-ca-trust-extracted\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.274118 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.274052 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bce0e47e-7873-4e15-8bf5-defeba101e19-installation-pull-secrets\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.274118 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:57.274080 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 15:59:57.274118 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:57.274103 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65cfbcb6dc-lbj9q: secret "image-registry-tls" not found Apr 22 15:59:57.274933 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:57.274176 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-tls podName:bce0e47e-7873-4e15-8bf5-defeba101e19 nodeName:}" 
failed. No retries permitted until 2026-04-22 15:59:57.774156734 +0000 UTC m=+78.673710802 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-tls") pod "image-registry-65cfbcb6dc-lbj9q" (UID: "bce0e47e-7873-4e15-8bf5-defeba101e19") : secret "image-registry-tls" not found Apr 22 15:59:57.274933 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.274086 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dq4p2\" (UniqueName: \"kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-kube-api-access-dq4p2\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.274933 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.274321 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bce0e47e-7873-4e15-8bf5-defeba101e19-image-registry-private-configuration\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.274933 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.274364 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bce0e47e-7873-4e15-8bf5-defeba101e19-trusted-ca\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.274933 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.274501 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bce0e47e-7873-4e15-8bf5-defeba101e19-ca-trust-extracted\") pod 
\"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.274933 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.274674 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-certificates\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.275310 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.275290 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bce0e47e-7873-4e15-8bf5-defeba101e19-trusted-ca\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.276646 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.276630 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bce0e47e-7873-4e15-8bf5-defeba101e19-image-registry-private-configuration\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.276704 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.276688 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bce0e47e-7873-4e15-8bf5-defeba101e19-installation-pull-secrets\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.282322 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.282295 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-bound-sa-token\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.282405 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.282354 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq4p2\" (UniqueName: \"kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-kube-api-access-dq4p2\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.778007 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.777969 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-tls\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:57.778444 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:57.778098 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 15:59:57.778444 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:57.778113 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65cfbcb6dc-lbj9q: secret "image-registry-tls" not found Apr 22 15:59:57.778444 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:57.778175 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-tls podName:bce0e47e-7873-4e15-8bf5-defeba101e19 nodeName:}" failed. 
No retries permitted until 2026-04-22 15:59:58.778151589 +0000 UTC m=+79.677705646 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-tls") pod "image-registry-65cfbcb6dc-lbj9q" (UID: "bce0e47e-7873-4e15-8bf5-defeba101e19") : secret "image-registry-tls" not found Apr 22 15:59:57.902841 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:57.902794 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rm52c" event={"ID":"59310538-efb1-4059-a9c2-3dac6061df75","Type":"ContainerStarted","Data":"2991960ebb2d5c8f33720575421c1f2838c83e4467d144ec61891b477eb7f07d"} Apr 22 15:59:58.786653 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:58.786564 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-tls\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 15:59:58.787091 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:58.786701 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 15:59:58.787091 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:58.786720 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65cfbcb6dc-lbj9q: secret "image-registry-tls" not found Apr 22 15:59:58.787091 ip-10-0-135-9 kubenswrapper[2572]: E0422 15:59:58.786788 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-tls podName:bce0e47e-7873-4e15-8bf5-defeba101e19 nodeName:}" failed. 
No retries permitted until 2026-04-22 16:00:00.78676994 +0000 UTC m=+81.686324013 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-tls") pod "image-registry-65cfbcb6dc-lbj9q" (UID: "bce0e47e-7873-4e15-8bf5-defeba101e19") : secret "image-registry-tls" not found Apr 22 15:59:58.907466 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:58.907418 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j58f5" event={"ID":"8d7ddd84-35ed-400b-ad69-647f50964d8c","Type":"ContainerStarted","Data":"ac49a8404b39aa902a0ada07c259fc70860b9483e29bdadf43278dfcce94c8ca"} Apr 22 15:59:58.922264 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:58.922212 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j58f5" podStartSLOduration=1.524473601 podStartE2EDuration="3.922180523s" podCreationTimestamp="2026-04-22 15:59:55 +0000 UTC" firstStartedPulling="2026-04-22 15:59:55.971084479 +0000 UTC m=+76.870638528" lastFinishedPulling="2026-04-22 15:59:58.368791392 +0000 UTC m=+79.268345450" observedRunningTime="2026-04-22 15:59:58.921299778 +0000 UTC m=+79.820853851" watchObservedRunningTime="2026-04-22 15:59:58.922180523 +0000 UTC m=+79.821734594" Apr 22 15:59:59.676335 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:59.676296 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-shl8s"] Apr 22 15:59:59.679693 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:59.679672 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-shl8s" Apr 22 15:59:59.682529 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:59.682505 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 15:59:59.682626 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:59.682519 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-k9s66\"" Apr 22 15:59:59.682797 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:59.682780 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 15:59:59.696506 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:59.696465 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-shl8s"] Apr 22 15:59:59.696675 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:59.696578 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbhcz\" (UniqueName: \"kubernetes.io/projected/cd8b8745-3849-4e7e-a5f5-60ba35f4329e-kube-api-access-kbhcz\") pod \"migrator-74bb7799d9-shl8s\" (UID: \"cd8b8745-3849-4e7e-a5f5-60ba35f4329e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-shl8s" Apr 22 15:59:59.797636 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:59.797596 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbhcz\" (UniqueName: \"kubernetes.io/projected/cd8b8745-3849-4e7e-a5f5-60ba35f4329e-kube-api-access-kbhcz\") pod \"migrator-74bb7799d9-shl8s\" (UID: \"cd8b8745-3849-4e7e-a5f5-60ba35f4329e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-shl8s" Apr 22 15:59:59.805180 ip-10-0-135-9 kubenswrapper[2572]: I0422 
15:59:59.805148 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbhcz\" (UniqueName: \"kubernetes.io/projected/cd8b8745-3849-4e7e-a5f5-60ba35f4329e-kube-api-access-kbhcz\") pod \"migrator-74bb7799d9-shl8s\" (UID: \"cd8b8745-3849-4e7e-a5f5-60ba35f4329e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-shl8s" Apr 22 15:59:59.911258 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:59.911221 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rm52c" event={"ID":"59310538-efb1-4059-a9c2-3dac6061df75","Type":"ContainerStarted","Data":"c9c2a80f899a605505b8cbcd156d0a1f74e7c364e948a18d6753d7462231342d"} Apr 22 15:59:59.925086 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:59.925028 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rm52c" podStartSLOduration=1.7574512100000002 podStartE2EDuration="3.925012449s" podCreationTimestamp="2026-04-22 15:59:56 +0000 UTC" firstStartedPulling="2026-04-22 15:59:57.010553416 +0000 UTC m=+77.910107468" lastFinishedPulling="2026-04-22 15:59:59.178114656 +0000 UTC m=+80.077668707" observedRunningTime="2026-04-22 15:59:59.924565157 +0000 UTC m=+80.824119229" watchObservedRunningTime="2026-04-22 15:59:59.925012449 +0000 UTC m=+80.824566519" Apr 22 15:59:59.990605 ip-10-0-135-9 kubenswrapper[2572]: I0422 15:59:59.990531 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-shl8s" Apr 22 16:00:00.119324 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:00.119237 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-shl8s"] Apr 22 16:00:00.121779 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:00:00.121750 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd8b8745_3849_4e7e_a5f5_60ba35f4329e.slice/crio-6150a3441ebfeccf0ef22b090dfad2057f3cacf73938ebde60a789acde5f2647 WatchSource:0}: Error finding container 6150a3441ebfeccf0ef22b090dfad2057f3cacf73938ebde60a789acde5f2647: Status 404 returned error can't find the container with id 6150a3441ebfeccf0ef22b090dfad2057f3cacf73938ebde60a789acde5f2647 Apr 22 16:00:00.806513 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:00.806472 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-tls\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 16:00:00.807072 ip-10-0-135-9 kubenswrapper[2572]: E0422 16:00:00.806634 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 16:00:00.807072 ip-10-0-135-9 kubenswrapper[2572]: E0422 16:00:00.806655 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65cfbcb6dc-lbj9q: secret "image-registry-tls" not found Apr 22 16:00:00.807072 ip-10-0-135-9 kubenswrapper[2572]: E0422 16:00:00.806713 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-tls 
podName:bce0e47e-7873-4e15-8bf5-defeba101e19 nodeName:}" failed. No retries permitted until 2026-04-22 16:00:04.806694234 +0000 UTC m=+85.706248283 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-tls") pod "image-registry-65cfbcb6dc-lbj9q" (UID: "bce0e47e-7873-4e15-8bf5-defeba101e19") : secret "image-registry-tls" not found Apr 22 16:00:00.915251 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:00.915219 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-shl8s" event={"ID":"cd8b8745-3849-4e7e-a5f5-60ba35f4329e","Type":"ContainerStarted","Data":"6150a3441ebfeccf0ef22b090dfad2057f3cacf73938ebde60a789acde5f2647"} Apr 22 16:00:01.919340 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:01.919294 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-shl8s" event={"ID":"cd8b8745-3849-4e7e-a5f5-60ba35f4329e","Type":"ContainerStarted","Data":"b4ba03475aa93c707d2dfa3e74e8ce3ee1c81c004300e21dc24584eb5f4ade8f"} Apr 22 16:00:01.919340 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:01.919335 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-shl8s" event={"ID":"cd8b8745-3849-4e7e-a5f5-60ba35f4329e","Type":"ContainerStarted","Data":"aa1a6b5816e77aa5209c598d9fd1a9516869a747de40cdf3dbb968a83da177cd"} Apr 22 16:00:01.933808 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:01.933757 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-shl8s" podStartSLOduration=1.610024345 podStartE2EDuration="2.933743816s" podCreationTimestamp="2026-04-22 15:59:59 +0000 UTC" firstStartedPulling="2026-04-22 16:00:00.123703236 +0000 UTC m=+81.023257284" lastFinishedPulling="2026-04-22 16:00:01.447422702 
+0000 UTC m=+82.346976755" observedRunningTime="2026-04-22 16:00:01.93242622 +0000 UTC m=+82.831980293" watchObservedRunningTime="2026-04-22 16:00:01.933743816 +0000 UTC m=+82.833297887" Apr 22 16:00:04.840097 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:04.840050 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-tls\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 16:00:04.840611 ip-10-0-135-9 kubenswrapper[2572]: E0422 16:00:04.840296 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 16:00:04.840611 ip-10-0-135-9 kubenswrapper[2572]: E0422 16:00:04.840314 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65cfbcb6dc-lbj9q: secret "image-registry-tls" not found Apr 22 16:00:04.840611 ip-10-0-135-9 kubenswrapper[2572]: E0422 16:00:04.840392 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-tls podName:bce0e47e-7873-4e15-8bf5-defeba101e19 nodeName:}" failed. No retries permitted until 2026-04-22 16:00:12.840372105 +0000 UTC m=+93.739926175 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-tls") pod "image-registry-65cfbcb6dc-lbj9q" (UID: "bce0e47e-7873-4e15-8bf5-defeba101e19") : secret "image-registry-tls" not found Apr 22 16:00:12.908901 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:12.908862 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-tls\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 16:00:12.912047 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:12.912017 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-tls\") pod \"image-registry-65cfbcb6dc-lbj9q\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 16:00:13.042823 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:13.042774 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 16:00:13.166478 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:13.166396 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-65cfbcb6dc-lbj9q"] Apr 22 16:00:13.169083 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:00:13.169044 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbce0e47e_7873_4e15_8bf5_defeba101e19.slice/crio-786a8014bf0718e978c01334493c8e11182709d122727719c9b10a4dc6de9996 WatchSource:0}: Error finding container 786a8014bf0718e978c01334493c8e11182709d122727719c9b10a4dc6de9996: Status 404 returned error can't find the container with id 786a8014bf0718e978c01334493c8e11182709d122727719c9b10a4dc6de9996 Apr 22 16:00:13.950554 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:13.950515 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" event={"ID":"bce0e47e-7873-4e15-8bf5-defeba101e19","Type":"ContainerStarted","Data":"4d3fe81c663aa979114f5821a719f122de85bf1198d8b7d9722cf944ed76831c"} Apr 22 16:00:13.950554 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:13.950553 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" event={"ID":"bce0e47e-7873-4e15-8bf5-defeba101e19","Type":"ContainerStarted","Data":"786a8014bf0718e978c01334493c8e11182709d122727719c9b10a4dc6de9996"} Apr 22 16:00:13.951091 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:13.950668 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 16:00:13.968598 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:13.968540 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" 
podStartSLOduration=16.968523431 podStartE2EDuration="16.968523431s" podCreationTimestamp="2026-04-22 15:59:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:00:13.967977457 +0000 UTC m=+94.867531530" watchObservedRunningTime="2026-04-22 16:00:13.968523431 +0000 UTC m=+94.868077504" Apr 22 16:00:16.838162 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:16.838123 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2e728dc-359a-4e6e-831e-f1b83f015c97-metrics-tls\") pod \"dns-default-nfknt\" (UID: \"f2e728dc-359a-4e6e-831e-f1b83f015c97\") " pod="openshift-dns/dns-default-nfknt" Apr 22 16:00:16.838162 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:16.838169 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11c29039-8c35-465e-8df3-408a688c08ed-cert\") pod \"ingress-canary-pcv67\" (UID: \"11c29039-8c35-465e-8df3-408a688c08ed\") " pod="openshift-ingress-canary/ingress-canary-pcv67" Apr 22 16:00:16.840678 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:16.840653 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2e728dc-359a-4e6e-831e-f1b83f015c97-metrics-tls\") pod \"dns-default-nfknt\" (UID: \"f2e728dc-359a-4e6e-831e-f1b83f015c97\") " pod="openshift-dns/dns-default-nfknt" Apr 22 16:00:16.840901 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:16.840880 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11c29039-8c35-465e-8df3-408a688c08ed-cert\") pod \"ingress-canary-pcv67\" (UID: \"11c29039-8c35-465e-8df3-408a688c08ed\") " pod="openshift-ingress-canary/ingress-canary-pcv67" Apr 22 16:00:16.911500 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:16.911463 2572 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-lm8hj\"" Apr 22 16:00:16.920448 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:16.920424 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nfknt" Apr 22 16:00:16.924165 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:16.924142 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sbbp8\"" Apr 22 16:00:16.932908 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:16.932874 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pcv67" Apr 22 16:00:17.053645 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:17.053530 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nfknt"] Apr 22 16:00:17.056259 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:00:17.056226 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2e728dc_359a_4e6e_831e_f1b83f015c97.slice/crio-16aeb8995cd1a141a4fdd58ffc9198bcab97ba0bc3b6fe7415d03d71e451adb3 WatchSource:0}: Error finding container 16aeb8995cd1a141a4fdd58ffc9198bcab97ba0bc3b6fe7415d03d71e451adb3: Status 404 returned error can't find the container with id 16aeb8995cd1a141a4fdd58ffc9198bcab97ba0bc3b6fe7415d03d71e451adb3 Apr 22 16:00:17.069406 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:17.069372 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pcv67"] Apr 22 16:00:17.073549 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:00:17.073521 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11c29039_8c35_465e_8df3_408a688c08ed.slice/crio-4b4e30d744afdb86413e84053dae69d6ec4e5de772a9fd7af164167e9ec2d73a WatchSource:0}: Error finding container 
4b4e30d744afdb86413e84053dae69d6ec4e5de772a9fd7af164167e9ec2d73a: Status 404 returned error can't find the container with id 4b4e30d744afdb86413e84053dae69d6ec4e5de772a9fd7af164167e9ec2d73a Apr 22 16:00:17.963113 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:17.963054 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pcv67" event={"ID":"11c29039-8c35-465e-8df3-408a688c08ed","Type":"ContainerStarted","Data":"4b4e30d744afdb86413e84053dae69d6ec4e5de772a9fd7af164167e9ec2d73a"} Apr 22 16:00:17.964342 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:17.964302 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nfknt" event={"ID":"f2e728dc-359a-4e6e-831e-f1b83f015c97","Type":"ContainerStarted","Data":"16aeb8995cd1a141a4fdd58ffc9198bcab97ba0bc3b6fe7415d03d71e451adb3"} Apr 22 16:00:19.973565 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:19.973525 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pcv67" event={"ID":"11c29039-8c35-465e-8df3-408a688c08ed","Type":"ContainerStarted","Data":"34a216e5aa04f6fd761f1c8017d17195a2dad5caf7acd4c61440ee50cc0f7e14"} Apr 22 16:00:19.975127 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:19.975098 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nfknt" event={"ID":"f2e728dc-359a-4e6e-831e-f1b83f015c97","Type":"ContainerStarted","Data":"4dc4d4fd73a7602bf4d873fb5717b73197e7faf950c7476e981ba99533042074"} Apr 22 16:00:19.975290 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:19.975136 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nfknt" event={"ID":"f2e728dc-359a-4e6e-831e-f1b83f015c97","Type":"ContainerStarted","Data":"5ef0f660bcb36512ca656b98b3ce7ae41eb407b95ff8399abf01ef7bdc594738"} Apr 22 16:00:19.975290 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:19.975238 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="openshift-dns/dns-default-nfknt" Apr 22 16:00:19.990979 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:19.990922 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-pcv67" podStartSLOduration=66.034042299 podStartE2EDuration="1m7.990906744s" podCreationTimestamp="2026-04-22 15:59:12 +0000 UTC" firstStartedPulling="2026-04-22 16:00:17.075407266 +0000 UTC m=+97.974961319" lastFinishedPulling="2026-04-22 16:00:19.032271716 +0000 UTC m=+99.931825764" observedRunningTime="2026-04-22 16:00:19.990226974 +0000 UTC m=+100.889781048" watchObservedRunningTime="2026-04-22 16:00:19.990906744 +0000 UTC m=+100.890460815" Apr 22 16:00:20.005240 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:20.005167 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-nfknt" podStartSLOduration=66.035961352 podStartE2EDuration="1m8.005150541s" podCreationTimestamp="2026-04-22 15:59:12 +0000 UTC" firstStartedPulling="2026-04-22 16:00:17.05830737 +0000 UTC m=+97.957861418" lastFinishedPulling="2026-04-22 16:00:19.027496545 +0000 UTC m=+99.927050607" observedRunningTime="2026-04-22 16:00:20.004875521 +0000 UTC m=+100.904429595" watchObservedRunningTime="2026-04-22 16:00:20.005150541 +0000 UTC m=+100.904704611" Apr 22 16:00:20.881448 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:20.881417 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-dqwgt" Apr 22 16:00:25.422594 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.422561 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48"] Apr 22 16:00:25.424891 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.424868 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" Apr 22 16:00:25.427346 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.427325 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 16:00:25.427468 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.427325 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 16:00:25.427575 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.427560 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 16:00:25.427796 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.427775 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 16:00:25.428364 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.428347 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 16:00:25.428503 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.428349 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 16:00:25.428503 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.428499 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 16:00:25.435385 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.435361 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48"] Apr 22 16:00:25.481915 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.481883 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-65cfbcb6dc-lbj9q"] Apr 22 16:00:25.503620 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.503579 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/1a3bf5fb-9189-40bc-b957-b3b60a109a0f-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-8555cc6597-mrm48\" (UID: \"1a3bf5fb-9189-40bc-b957-b3b60a109a0f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" Apr 22 16:00:25.503793 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.503643 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1a3bf5fb-9189-40bc-b957-b3b60a109a0f-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-8555cc6597-mrm48\" (UID: \"1a3bf5fb-9189-40bc-b957-b3b60a109a0f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" Apr 22 16:00:25.503793 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.503722 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/1a3bf5fb-9189-40bc-b957-b3b60a109a0f-ca\") pod \"cluster-proxy-proxy-agent-8555cc6597-mrm48\" (UID: \"1a3bf5fb-9189-40bc-b957-b3b60a109a0f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" Apr 22 16:00:25.503793 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.503773 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/1a3bf5fb-9189-40bc-b957-b3b60a109a0f-hub\") pod 
\"cluster-proxy-proxy-agent-8555cc6597-mrm48\" (UID: \"1a3bf5fb-9189-40bc-b957-b3b60a109a0f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" Apr 22 16:00:25.503911 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.503814 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k85rv\" (UniqueName: \"kubernetes.io/projected/1a3bf5fb-9189-40bc-b957-b3b60a109a0f-kube-api-access-k85rv\") pod \"cluster-proxy-proxy-agent-8555cc6597-mrm48\" (UID: \"1a3bf5fb-9189-40bc-b957-b3b60a109a0f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" Apr 22 16:00:25.503911 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.503865 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/1a3bf5fb-9189-40bc-b957-b3b60a109a0f-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-8555cc6597-mrm48\" (UID: \"1a3bf5fb-9189-40bc-b957-b3b60a109a0f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" Apr 22 16:00:25.519464 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.519428 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-rkfvs"] Apr 22 16:00:25.522696 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.522672 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rkfvs" Apr 22 16:00:25.540672 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.538601 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-lz7ck\"" Apr 22 16:00:25.540672 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.539111 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 16:00:25.540672 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.540288 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 16:00:25.542110 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.541982 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rkfvs"] Apr 22 16:00:25.605097 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.605060 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/50e48078-0749-491a-b7f0-fec2248f200a-data-volume\") pod \"insights-runtime-extractor-rkfvs\" (UID: \"50e48078-0749-491a-b7f0-fec2248f200a\") " pod="openshift-insights/insights-runtime-extractor-rkfvs" Apr 22 16:00:25.605097 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.605100 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/1a3bf5fb-9189-40bc-b957-b3b60a109a0f-hub\") pod \"cluster-proxy-proxy-agent-8555cc6597-mrm48\" (UID: \"1a3bf5fb-9189-40bc-b957-b3b60a109a0f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" Apr 22 16:00:25.605373 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.605121 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k85rv\" 
(UniqueName: \"kubernetes.io/projected/1a3bf5fb-9189-40bc-b957-b3b60a109a0f-kube-api-access-k85rv\") pod \"cluster-proxy-proxy-agent-8555cc6597-mrm48\" (UID: \"1a3bf5fb-9189-40bc-b957-b3b60a109a0f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" Apr 22 16:00:25.605373 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.605141 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/1a3bf5fb-9189-40bc-b957-b3b60a109a0f-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-8555cc6597-mrm48\" (UID: \"1a3bf5fb-9189-40bc-b957-b3b60a109a0f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" Apr 22 16:00:25.605373 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.605261 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/1a3bf5fb-9189-40bc-b957-b3b60a109a0f-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-8555cc6597-mrm48\" (UID: \"1a3bf5fb-9189-40bc-b957-b3b60a109a0f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" Apr 22 16:00:25.605373 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.605319 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/50e48078-0749-491a-b7f0-fec2248f200a-crio-socket\") pod \"insights-runtime-extractor-rkfvs\" (UID: \"50e48078-0749-491a-b7f0-fec2248f200a\") " pod="openshift-insights/insights-runtime-extractor-rkfvs" Apr 22 16:00:25.605373 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.605346 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/50e48078-0749-491a-b7f0-fec2248f200a-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-rkfvs\" (UID: \"50e48078-0749-491a-b7f0-fec2248f200a\") " pod="openshift-insights/insights-runtime-extractor-rkfvs" Apr 22 16:00:25.605620 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.605388 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh72v\" (UniqueName: \"kubernetes.io/projected/50e48078-0749-491a-b7f0-fec2248f200a-kube-api-access-kh72v\") pod \"insights-runtime-extractor-rkfvs\" (UID: \"50e48078-0749-491a-b7f0-fec2248f200a\") " pod="openshift-insights/insights-runtime-extractor-rkfvs" Apr 22 16:00:25.605620 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.605433 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1a3bf5fb-9189-40bc-b957-b3b60a109a0f-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-8555cc6597-mrm48\" (UID: \"1a3bf5fb-9189-40bc-b957-b3b60a109a0f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" Apr 22 16:00:25.605620 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.605481 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/1a3bf5fb-9189-40bc-b957-b3b60a109a0f-ca\") pod \"cluster-proxy-proxy-agent-8555cc6597-mrm48\" (UID: \"1a3bf5fb-9189-40bc-b957-b3b60a109a0f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" Apr 22 16:00:25.605620 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.605509 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/50e48078-0749-491a-b7f0-fec2248f200a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rkfvs\" (UID: \"50e48078-0749-491a-b7f0-fec2248f200a\") " pod="openshift-insights/insights-runtime-extractor-rkfvs" Apr 22 16:00:25.605920 ip-10-0-135-9 kubenswrapper[2572]: 
I0422 16:00:25.605890 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/1a3bf5fb-9189-40bc-b957-b3b60a109a0f-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-8555cc6597-mrm48\" (UID: \"1a3bf5fb-9189-40bc-b957-b3b60a109a0f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" Apr 22 16:00:25.608102 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.608082 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1a3bf5fb-9189-40bc-b957-b3b60a109a0f-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-8555cc6597-mrm48\" (UID: \"1a3bf5fb-9189-40bc-b957-b3b60a109a0f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" Apr 22 16:00:25.608239 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.608160 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/1a3bf5fb-9189-40bc-b957-b3b60a109a0f-ca\") pod \"cluster-proxy-proxy-agent-8555cc6597-mrm48\" (UID: \"1a3bf5fb-9189-40bc-b957-b3b60a109a0f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" Apr 22 16:00:25.608239 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.608160 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/1a3bf5fb-9189-40bc-b957-b3b60a109a0f-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-8555cc6597-mrm48\" (UID: \"1a3bf5fb-9189-40bc-b957-b3b60a109a0f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" Apr 22 16:00:25.608364 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.608345 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/1a3bf5fb-9189-40bc-b957-b3b60a109a0f-hub\") pod 
\"cluster-proxy-proxy-agent-8555cc6597-mrm48\" (UID: \"1a3bf5fb-9189-40bc-b957-b3b60a109a0f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" Apr 22 16:00:25.613544 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.613520 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k85rv\" (UniqueName: \"kubernetes.io/projected/1a3bf5fb-9189-40bc-b957-b3b60a109a0f-kube-api-access-k85rv\") pod \"cluster-proxy-proxy-agent-8555cc6597-mrm48\" (UID: \"1a3bf5fb-9189-40bc-b957-b3b60a109a0f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" Apr 22 16:00:25.706032 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.705926 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/50e48078-0749-491a-b7f0-fec2248f200a-crio-socket\") pod \"insights-runtime-extractor-rkfvs\" (UID: \"50e48078-0749-491a-b7f0-fec2248f200a\") " pod="openshift-insights/insights-runtime-extractor-rkfvs" Apr 22 16:00:25.706032 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.705970 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/50e48078-0749-491a-b7f0-fec2248f200a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rkfvs\" (UID: \"50e48078-0749-491a-b7f0-fec2248f200a\") " pod="openshift-insights/insights-runtime-extractor-rkfvs" Apr 22 16:00:25.706032 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.705997 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kh72v\" (UniqueName: \"kubernetes.io/projected/50e48078-0749-491a-b7f0-fec2248f200a-kube-api-access-kh72v\") pod \"insights-runtime-extractor-rkfvs\" (UID: \"50e48078-0749-491a-b7f0-fec2248f200a\") " pod="openshift-insights/insights-runtime-extractor-rkfvs" Apr 22 16:00:25.706347 ip-10-0-135-9 
kubenswrapper[2572]: I0422 16:00:25.706039 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/50e48078-0749-491a-b7f0-fec2248f200a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rkfvs\" (UID: \"50e48078-0749-491a-b7f0-fec2248f200a\") " pod="openshift-insights/insights-runtime-extractor-rkfvs" Apr 22 16:00:25.706347 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.706068 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/50e48078-0749-491a-b7f0-fec2248f200a-data-volume\") pod \"insights-runtime-extractor-rkfvs\" (UID: \"50e48078-0749-491a-b7f0-fec2248f200a\") " pod="openshift-insights/insights-runtime-extractor-rkfvs" Apr 22 16:00:25.706347 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.706061 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/50e48078-0749-491a-b7f0-fec2248f200a-crio-socket\") pod \"insights-runtime-extractor-rkfvs\" (UID: \"50e48078-0749-491a-b7f0-fec2248f200a\") " pod="openshift-insights/insights-runtime-extractor-rkfvs" Apr 22 16:00:25.706511 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.706498 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/50e48078-0749-491a-b7f0-fec2248f200a-data-volume\") pod \"insights-runtime-extractor-rkfvs\" (UID: \"50e48078-0749-491a-b7f0-fec2248f200a\") " pod="openshift-insights/insights-runtime-extractor-rkfvs" Apr 22 16:00:25.706708 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.706686 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/50e48078-0749-491a-b7f0-fec2248f200a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rkfvs\" (UID: \"50e48078-0749-491a-b7f0-fec2248f200a\") " 
pod="openshift-insights/insights-runtime-extractor-rkfvs" Apr 22 16:00:25.708504 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.708485 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/50e48078-0749-491a-b7f0-fec2248f200a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rkfvs\" (UID: \"50e48078-0749-491a-b7f0-fec2248f200a\") " pod="openshift-insights/insights-runtime-extractor-rkfvs" Apr 22 16:00:25.715457 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.715426 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh72v\" (UniqueName: \"kubernetes.io/projected/50e48078-0749-491a-b7f0-fec2248f200a-kube-api-access-kh72v\") pod \"insights-runtime-extractor-rkfvs\" (UID: \"50e48078-0749-491a-b7f0-fec2248f200a\") " pod="openshift-insights/insights-runtime-extractor-rkfvs" Apr 22 16:00:25.745004 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.744965 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" Apr 22 16:00:25.852140 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.852111 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rkfvs" Apr 22 16:00:25.874174 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.874132 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48"] Apr 22 16:00:25.878750 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:00:25.878716 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a3bf5fb_9189_40bc_b957_b3b60a109a0f.slice/crio-1cab08f167a72ba61bcbf593d57d3b0c0d4c05829d8e98412b4a2bd1e5ba5fa9 WatchSource:0}: Error finding container 1cab08f167a72ba61bcbf593d57d3b0c0d4c05829d8e98412b4a2bd1e5ba5fa9: Status 404 returned error can't find the container with id 1cab08f167a72ba61bcbf593d57d3b0c0d4c05829d8e98412b4a2bd1e5ba5fa9 Apr 22 16:00:25.976865 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.976784 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rkfvs"] Apr 22 16:00:25.981022 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:00:25.980990 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50e48078_0749_491a_b7f0_fec2248f200a.slice/crio-89e810f35f483770dd2b6f1801e1e15d4982f92dc3c41d016afe036f75ddf106 WatchSource:0}: Error finding container 89e810f35f483770dd2b6f1801e1e15d4982f92dc3c41d016afe036f75ddf106: Status 404 returned error can't find the container with id 89e810f35f483770dd2b6f1801e1e15d4982f92dc3c41d016afe036f75ddf106 Apr 22 16:00:25.991210 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:25.991163 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rkfvs" event={"ID":"50e48078-0749-491a-b7f0-fec2248f200a","Type":"ContainerStarted","Data":"89e810f35f483770dd2b6f1801e1e15d4982f92dc3c41d016afe036f75ddf106"} Apr 22 16:00:25.992060 ip-10-0-135-9 
kubenswrapper[2572]: I0422 16:00:25.992025 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" event={"ID":"1a3bf5fb-9189-40bc-b957-b3b60a109a0f","Type":"ContainerStarted","Data":"1cab08f167a72ba61bcbf593d57d3b0c0d4c05829d8e98412b4a2bd1e5ba5fa9"} Apr 22 16:00:26.997010 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:26.996918 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rkfvs" event={"ID":"50e48078-0749-491a-b7f0-fec2248f200a","Type":"ContainerStarted","Data":"26cb438842ecb53e783036f593d9bc92ee5fdd7be8cdefe015e7e36362b5ab8e"} Apr 22 16:00:26.997010 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:26.996964 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rkfvs" event={"ID":"50e48078-0749-491a-b7f0-fec2248f200a","Type":"ContainerStarted","Data":"c3c228fc85a48271fa468d4c29e86003ae56e349a30d7b6fae270165b397f14f"} Apr 22 16:00:29.981102 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:29.981067 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-nfknt" Apr 22 16:00:30.007121 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:30.007080 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rkfvs" event={"ID":"50e48078-0749-491a-b7f0-fec2248f200a","Type":"ContainerStarted","Data":"b860251c7c408271d80fb7a747dfdcf2f56676205888940083d1e33738d1ace1"} Apr 22 16:00:30.009239 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:30.009183 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" event={"ID":"1a3bf5fb-9189-40bc-b957-b3b60a109a0f","Type":"ContainerStarted","Data":"e26961468f0f83a355b4c633f0df49c7f07ee79704100b52dbf28e9f7963bcb0"} Apr 22 16:00:30.024440 ip-10-0-135-9 kubenswrapper[2572]: I0422 
16:00:30.024148 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-rkfvs" podStartSLOduration=2.066936609 podStartE2EDuration="5.024129584s" podCreationTimestamp="2026-04-22 16:00:25 +0000 UTC" firstStartedPulling="2026-04-22 16:00:26.038415278 +0000 UTC m=+106.937969328" lastFinishedPulling="2026-04-22 16:00:28.995608252 +0000 UTC m=+109.895162303" observedRunningTime="2026-04-22 16:00:30.023904064 +0000 UTC m=+110.923458135" watchObservedRunningTime="2026-04-22 16:00:30.024129584 +0000 UTC m=+110.923683656" Apr 22 16:00:32.016207 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:32.016149 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" event={"ID":"1a3bf5fb-9189-40bc-b957-b3b60a109a0f","Type":"ContainerStarted","Data":"e6cc1833c14c75c909f558761ab02690f2a59c928ab24699c8726833afb25274"} Apr 22 16:00:32.016618 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:32.016215 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" event={"ID":"1a3bf5fb-9189-40bc-b957-b3b60a109a0f","Type":"ContainerStarted","Data":"e78d9e2ade1b46b88ea22ddd3eed1bf19847b0484ab6f63f4d905cb12c444315"} Apr 22 16:00:32.036212 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:32.036138 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8555cc6597-mrm48" podStartSLOduration=1.904442897 podStartE2EDuration="7.036118942s" podCreationTimestamp="2026-04-22 16:00:25 +0000 UTC" firstStartedPulling="2026-04-22 16:00:25.880994966 +0000 UTC m=+106.780549015" lastFinishedPulling="2026-04-22 16:00:31.012671007 +0000 UTC m=+111.912225060" observedRunningTime="2026-04-22 16:00:32.035758724 +0000 UTC m=+112.935312807" watchObservedRunningTime="2026-04-22 16:00:32.036118942 +0000 UTC 
m=+112.935673014" Apr 22 16:00:34.106317 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.106281 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-tqbzl"] Apr 22 16:00:34.109255 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.109226 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.111611 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.111585 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 16:00:34.111743 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.111669 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 16:00:34.111743 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.111719 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 16:00:34.111860 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.111785 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 16:00:34.112082 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.112065 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 16:00:34.112487 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.112472 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-4m4rr\"" Apr 22 16:00:34.112532 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.112493 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 16:00:34.176243 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.176207 
2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/26908f87-5ff2-4d32-a9a8-20e451a88f3b-node-exporter-wtmp\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.176425 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.176253 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/26908f87-5ff2-4d32-a9a8-20e451a88f3b-metrics-client-ca\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.176425 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.176276 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/26908f87-5ff2-4d32-a9a8-20e451a88f3b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.176425 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.176299 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/26908f87-5ff2-4d32-a9a8-20e451a88f3b-node-exporter-textfile\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.176425 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.176344 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/26908f87-5ff2-4d32-a9a8-20e451a88f3b-node-exporter-accelerators-collector-config\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.176425 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.176380 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4h4l\" (UniqueName: \"kubernetes.io/projected/26908f87-5ff2-4d32-a9a8-20e451a88f3b-kube-api-access-p4h4l\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.176425 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.176418 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26908f87-5ff2-4d32-a9a8-20e451a88f3b-sys\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.176617 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.176451 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/26908f87-5ff2-4d32-a9a8-20e451a88f3b-root\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.176617 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.176468 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/26908f87-5ff2-4d32-a9a8-20e451a88f3b-node-exporter-tls\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.277143 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.277103 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26908f87-5ff2-4d32-a9a8-20e451a88f3b-sys\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.277143 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.277145 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/26908f87-5ff2-4d32-a9a8-20e451a88f3b-root\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.277411 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.277167 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/26908f87-5ff2-4d32-a9a8-20e451a88f3b-node-exporter-tls\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.277411 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.277236 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26908f87-5ff2-4d32-a9a8-20e451a88f3b-sys\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.277411 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.277247 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/26908f87-5ff2-4d32-a9a8-20e451a88f3b-node-exporter-wtmp\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.277411 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.277257 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/26908f87-5ff2-4d32-a9a8-20e451a88f3b-root\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.277411 ip-10-0-135-9 kubenswrapper[2572]: E0422 16:00:34.277331 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 16:00:34.277411 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.277371 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/26908f87-5ff2-4d32-a9a8-20e451a88f3b-node-exporter-wtmp\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.277411 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.277329 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/26908f87-5ff2-4d32-a9a8-20e451a88f3b-metrics-client-ca\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.277411 ip-10-0-135-9 kubenswrapper[2572]: E0422 16:00:34.277390 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26908f87-5ff2-4d32-a9a8-20e451a88f3b-node-exporter-tls podName:26908f87-5ff2-4d32-a9a8-20e451a88f3b nodeName:}" failed. No retries permitted until 2026-04-22 16:00:34.777376064 +0000 UTC m=+115.676930114 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/26908f87-5ff2-4d32-a9a8-20e451a88f3b-node-exporter-tls") pod "node-exporter-tqbzl" (UID: "26908f87-5ff2-4d32-a9a8-20e451a88f3b") : secret "node-exporter-tls" not found Apr 22 16:00:34.277801 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.277420 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/26908f87-5ff2-4d32-a9a8-20e451a88f3b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.277801 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.277441 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/26908f87-5ff2-4d32-a9a8-20e451a88f3b-node-exporter-textfile\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.277801 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.277458 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/26908f87-5ff2-4d32-a9a8-20e451a88f3b-node-exporter-accelerators-collector-config\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.277801 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.277476 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4h4l\" (UniqueName: \"kubernetes.io/projected/26908f87-5ff2-4d32-a9a8-20e451a88f3b-kube-api-access-p4h4l\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 
16:00:34.277801 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.277756 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/26908f87-5ff2-4d32-a9a8-20e451a88f3b-node-exporter-textfile\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.278049 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.277953 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/26908f87-5ff2-4d32-a9a8-20e451a88f3b-metrics-client-ca\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.278049 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.277960 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/26908f87-5ff2-4d32-a9a8-20e451a88f3b-node-exporter-accelerators-collector-config\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.279860 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.279842 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/26908f87-5ff2-4d32-a9a8-20e451a88f3b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.288207 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.288173 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4h4l\" (UniqueName: \"kubernetes.io/projected/26908f87-5ff2-4d32-a9a8-20e451a88f3b-kube-api-access-p4h4l\") pod \"node-exporter-tqbzl\" (UID: 
\"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.781137 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.781093 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/26908f87-5ff2-4d32-a9a8-20e451a88f3b-node-exporter-tls\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:34.783572 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:34.783550 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/26908f87-5ff2-4d32-a9a8-20e451a88f3b-node-exporter-tls\") pod \"node-exporter-tqbzl\" (UID: \"26908f87-5ff2-4d32-a9a8-20e451a88f3b\") " pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:35.018841 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:35.018796 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-tqbzl" Apr 22 16:00:35.027438 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:00:35.027407 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26908f87_5ff2_4d32_a9a8_20e451a88f3b.slice/crio-b17d809a62ac1168e031205e879900f3c1fc4b11e9ef094bf75df9c8a3a291de WatchSource:0}: Error finding container b17d809a62ac1168e031205e879900f3c1fc4b11e9ef094bf75df9c8a3a291de: Status 404 returned error can't find the container with id b17d809a62ac1168e031205e879900f3c1fc4b11e9ef094bf75df9c8a3a291de Apr 22 16:00:35.487814 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:35.487787 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 16:00:36.032650 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:36.032612 2572 generic.go:358] "Generic (PLEG): container finished" podID="26908f87-5ff2-4d32-a9a8-20e451a88f3b" containerID="5a5310153105d1ebd0315f63352190dc0b14842d8cfd749369e6f0f52c0b3f6e" exitCode=0 Apr 22 16:00:36.032650 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:36.032654 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tqbzl" event={"ID":"26908f87-5ff2-4d32-a9a8-20e451a88f3b","Type":"ContainerDied","Data":"5a5310153105d1ebd0315f63352190dc0b14842d8cfd749369e6f0f52c0b3f6e"} Apr 22 16:00:36.032944 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:36.032686 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tqbzl" event={"ID":"26908f87-5ff2-4d32-a9a8-20e451a88f3b","Type":"ContainerStarted","Data":"b17d809a62ac1168e031205e879900f3c1fc4b11e9ef094bf75df9c8a3a291de"} Apr 22 16:00:37.036613 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:37.036578 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tqbzl" 
event={"ID":"26908f87-5ff2-4d32-a9a8-20e451a88f3b","Type":"ContainerStarted","Data":"f6ea7a541928255db1ca46e4ea5cf70e0b5481c53e034db32b30f79b7e6a1625"} Apr 22 16:00:37.036986 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:37.036621 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tqbzl" event={"ID":"26908f87-5ff2-4d32-a9a8-20e451a88f3b","Type":"ContainerStarted","Data":"551c870fcdbf5b3f66413e5871ef25160dbe2f5b973157a9897f065524685017"} Apr 22 16:00:49.390052 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:49.389997 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs\") pod \"network-metrics-daemon-76x4b\" (UID: \"c090a1ee-5091-44d6-9e1b-65bf4dc8b1be\") " pod="openshift-multus/network-metrics-daemon-76x4b" Apr 22 16:00:49.392593 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:49.392565 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c090a1ee-5091-44d6-9e1b-65bf4dc8b1be-metrics-certs\") pod \"network-metrics-daemon-76x4b\" (UID: \"c090a1ee-5091-44d6-9e1b-65bf4dc8b1be\") " pod="openshift-multus/network-metrics-daemon-76x4b" Apr 22 16:00:49.462778 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:49.462741 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-52jwv\"" Apr 22 16:00:49.471634 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:49.471606 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-76x4b" Apr 22 16:00:49.596484 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:49.596433 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-tqbzl" podStartSLOduration=14.804432637 podStartE2EDuration="15.596417649s" podCreationTimestamp="2026-04-22 16:00:34 +0000 UTC" firstStartedPulling="2026-04-22 16:00:35.029542823 +0000 UTC m=+115.929096872" lastFinishedPulling="2026-04-22 16:00:35.821527826 +0000 UTC m=+116.721081884" observedRunningTime="2026-04-22 16:00:37.05749325 +0000 UTC m=+117.957047323" watchObservedRunningTime="2026-04-22 16:00:49.596417649 +0000 UTC m=+130.495971719" Apr 22 16:00:49.596659 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:49.596576 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-76x4b"] Apr 22 16:00:49.599505 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:00:49.599475 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc090a1ee_5091_44d6_9e1b_65bf4dc8b1be.slice/crio-03d2e7a409f983b19f91148282182be5f57219328c3678512d2a7085d18e51d7 WatchSource:0}: Error finding container 03d2e7a409f983b19f91148282182be5f57219328c3678512d2a7085d18e51d7: Status 404 returned error can't find the container with id 03d2e7a409f983b19f91148282182be5f57219328c3678512d2a7085d18e51d7 Apr 22 16:00:50.075150 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:50.075113 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-76x4b" event={"ID":"c090a1ee-5091-44d6-9e1b-65bf4dc8b1be","Type":"ContainerStarted","Data":"03d2e7a409f983b19f91148282182be5f57219328c3678512d2a7085d18e51d7"} Apr 22 16:00:50.502839 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:50.502753 2572 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" podUID="bce0e47e-7873-4e15-8bf5-defeba101e19" containerName="registry" containerID="cri-o://4d3fe81c663aa979114f5821a719f122de85bf1198d8b7d9722cf944ed76831c" gracePeriod=30 Apr 22 16:00:50.773921 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:50.773897 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 16:00:50.903057 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:50.903030 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bce0e47e-7873-4e15-8bf5-defeba101e19-installation-pull-secrets\") pod \"bce0e47e-7873-4e15-8bf5-defeba101e19\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " Apr 22 16:00:50.903253 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:50.903076 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-tls\") pod \"bce0e47e-7873-4e15-8bf5-defeba101e19\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " Apr 22 16:00:50.903253 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:50.903101 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bce0e47e-7873-4e15-8bf5-defeba101e19-trusted-ca\") pod \"bce0e47e-7873-4e15-8bf5-defeba101e19\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " Apr 22 16:00:50.903253 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:50.903230 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-bound-sa-token\") pod \"bce0e47e-7873-4e15-8bf5-defeba101e19\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " Apr 22 16:00:50.903423 ip-10-0-135-9 
kubenswrapper[2572]: I0422 16:00:50.903293 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq4p2\" (UniqueName: \"kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-kube-api-access-dq4p2\") pod \"bce0e47e-7873-4e15-8bf5-defeba101e19\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " Apr 22 16:00:50.903423 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:50.903342 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bce0e47e-7873-4e15-8bf5-defeba101e19-image-registry-private-configuration\") pod \"bce0e47e-7873-4e15-8bf5-defeba101e19\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " Apr 22 16:00:50.903423 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:50.903397 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-certificates\") pod \"bce0e47e-7873-4e15-8bf5-defeba101e19\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " Apr 22 16:00:50.903566 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:50.903424 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bce0e47e-7873-4e15-8bf5-defeba101e19-ca-trust-extracted\") pod \"bce0e47e-7873-4e15-8bf5-defeba101e19\" (UID: \"bce0e47e-7873-4e15-8bf5-defeba101e19\") " Apr 22 16:00:50.903566 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:50.903518 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bce0e47e-7873-4e15-8bf5-defeba101e19-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bce0e47e-7873-4e15-8bf5-defeba101e19" (UID: "bce0e47e-7873-4e15-8bf5-defeba101e19"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:00:50.903719 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:50.903641 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bce0e47e-7873-4e15-8bf5-defeba101e19-trusted-ca\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:00:50.904161 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:50.904109 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "bce0e47e-7873-4e15-8bf5-defeba101e19" (UID: "bce0e47e-7873-4e15-8bf5-defeba101e19"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:00:50.906012 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:50.905979 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "bce0e47e-7873-4e15-8bf5-defeba101e19" (UID: "bce0e47e-7873-4e15-8bf5-defeba101e19"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:00:50.906110 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:50.906077 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-kube-api-access-dq4p2" (OuterVolumeSpecName: "kube-api-access-dq4p2") pod "bce0e47e-7873-4e15-8bf5-defeba101e19" (UID: "bce0e47e-7873-4e15-8bf5-defeba101e19"). InnerVolumeSpecName "kube-api-access-dq4p2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:00:50.906530 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:50.906504 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bce0e47e-7873-4e15-8bf5-defeba101e19" (UID: "bce0e47e-7873-4e15-8bf5-defeba101e19"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:00:50.906700 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:50.906679 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce0e47e-7873-4e15-8bf5-defeba101e19-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "bce0e47e-7873-4e15-8bf5-defeba101e19" (UID: "bce0e47e-7873-4e15-8bf5-defeba101e19"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:00:50.907631 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:50.907610 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce0e47e-7873-4e15-8bf5-defeba101e19-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "bce0e47e-7873-4e15-8bf5-defeba101e19" (UID: "bce0e47e-7873-4e15-8bf5-defeba101e19"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:00:50.912405 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:50.912379 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bce0e47e-7873-4e15-8bf5-defeba101e19-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "bce0e47e-7873-4e15-8bf5-defeba101e19" (UID: "bce0e47e-7873-4e15-8bf5-defeba101e19"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:00:51.004590 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:51.004552 2572 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-certificates\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:00:51.004590 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:51.004581 2572 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bce0e47e-7873-4e15-8bf5-defeba101e19-ca-trust-extracted\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:00:51.004590 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:51.004590 2572 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bce0e47e-7873-4e15-8bf5-defeba101e19-installation-pull-secrets\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:00:51.004590 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:51.004600 2572 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-registry-tls\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:00:51.004864 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:51.004609 2572 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-bound-sa-token\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:00:51.004864 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:51.004617 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dq4p2\" (UniqueName: \"kubernetes.io/projected/bce0e47e-7873-4e15-8bf5-defeba101e19-kube-api-access-dq4p2\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:00:51.004864 ip-10-0-135-9 
kubenswrapper[2572]: I0422 16:00:51.004627 2572 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bce0e47e-7873-4e15-8bf5-defeba101e19-image-registry-private-configuration\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:00:51.079435 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:51.079395 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-76x4b" event={"ID":"c090a1ee-5091-44d6-9e1b-65bf4dc8b1be","Type":"ContainerStarted","Data":"509e8df2a5696db4fb7869fe5f54378940e6d1600d77a61d7a374600ad7a114c"} Apr 22 16:00:51.079611 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:51.079445 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-76x4b" event={"ID":"c090a1ee-5091-44d6-9e1b-65bf4dc8b1be","Type":"ContainerStarted","Data":"1039663c6e64d6a94f87366567724a2f8759cc6cc42c1c12da6b31122fa9e176"} Apr 22 16:00:51.080544 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:51.080520 2572 generic.go:358] "Generic (PLEG): container finished" podID="bce0e47e-7873-4e15-8bf5-defeba101e19" containerID="4d3fe81c663aa979114f5821a719f122de85bf1198d8b7d9722cf944ed76831c" exitCode=0 Apr 22 16:00:51.080625 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:51.080563 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" Apr 22 16:00:51.080625 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:51.080580 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" event={"ID":"bce0e47e-7873-4e15-8bf5-defeba101e19","Type":"ContainerDied","Data":"4d3fe81c663aa979114f5821a719f122de85bf1198d8b7d9722cf944ed76831c"} Apr 22 16:00:51.080625 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:51.080601 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65cfbcb6dc-lbj9q" event={"ID":"bce0e47e-7873-4e15-8bf5-defeba101e19","Type":"ContainerDied","Data":"786a8014bf0718e978c01334493c8e11182709d122727719c9b10a4dc6de9996"} Apr 22 16:00:51.080625 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:51.080615 2572 scope.go:117] "RemoveContainer" containerID="4d3fe81c663aa979114f5821a719f122de85bf1198d8b7d9722cf944ed76831c" Apr 22 16:00:51.088970 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:51.088937 2572 scope.go:117] "RemoveContainer" containerID="4d3fe81c663aa979114f5821a719f122de85bf1198d8b7d9722cf944ed76831c" Apr 22 16:00:51.089266 ip-10-0-135-9 kubenswrapper[2572]: E0422 16:00:51.089245 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d3fe81c663aa979114f5821a719f122de85bf1198d8b7d9722cf944ed76831c\": container with ID starting with 4d3fe81c663aa979114f5821a719f122de85bf1198d8b7d9722cf944ed76831c not found: ID does not exist" containerID="4d3fe81c663aa979114f5821a719f122de85bf1198d8b7d9722cf944ed76831c" Apr 22 16:00:51.089345 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:51.089279 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d3fe81c663aa979114f5821a719f122de85bf1198d8b7d9722cf944ed76831c"} err="failed to get container status 
\"4d3fe81c663aa979114f5821a719f122de85bf1198d8b7d9722cf944ed76831c\": rpc error: code = NotFound desc = could not find container \"4d3fe81c663aa979114f5821a719f122de85bf1198d8b7d9722cf944ed76831c\": container with ID starting with 4d3fe81c663aa979114f5821a719f122de85bf1198d8b7d9722cf944ed76831c not found: ID does not exist" Apr 22 16:00:51.095013 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:51.094966 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-76x4b" podStartSLOduration=131.048345024 podStartE2EDuration="2m12.094952374s" podCreationTimestamp="2026-04-22 15:58:39 +0000 UTC" firstStartedPulling="2026-04-22 16:00:49.601351817 +0000 UTC m=+130.500905870" lastFinishedPulling="2026-04-22 16:00:50.647959169 +0000 UTC m=+131.547513220" observedRunningTime="2026-04-22 16:00:51.093759709 +0000 UTC m=+131.993313782" watchObservedRunningTime="2026-04-22 16:00:51.094952374 +0000 UTC m=+131.994506445" Apr 22 16:00:51.111435 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:51.111378 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-65cfbcb6dc-lbj9q"] Apr 22 16:00:51.116244 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:51.113267 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-65cfbcb6dc-lbj9q"] Apr 22 16:00:51.643971 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:00:51.643935 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bce0e47e-7873-4e15-8bf5-defeba101e19" path="/var/lib/kubelet/pods/bce0e47e-7873-4e15-8bf5-defeba101e19/volumes" Apr 22 16:01:01.346260 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.346225 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-69687876bd-7x9jf"] Apr 22 16:01:01.346768 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.346638 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="bce0e47e-7873-4e15-8bf5-defeba101e19" containerName="registry" Apr 22 16:01:01.346768 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.346657 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bce0e47e-7873-4e15-8bf5-defeba101e19" containerName="registry" Apr 22 16:01:01.346768 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.346737 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="bce0e47e-7873-4e15-8bf5-defeba101e19" containerName="registry" Apr 22 16:01:01.349227 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.349182 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:01.352508 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.352464 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 16:01:01.352685 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.352512 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 16:01:01.352685 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.352533 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 16:01:01.352685 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.352545 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 16:01:01.352685 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.352560 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 16:01:01.352685 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.352536 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 16:01:01.352685 ip-10-0-135-9 kubenswrapper[2572]: I0422 
16:01:01.352467 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-6gvpr\"" Apr 22 16:01:01.352685 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.352684 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 16:01:01.356942 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.356923 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69687876bd-7x9jf"] Apr 22 16:01:01.481158 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.481103 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p42ng\" (UniqueName: \"kubernetes.io/projected/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-kube-api-access-p42ng\") pod \"console-69687876bd-7x9jf\" (UID: \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\") " pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:01.481158 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.481164 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-console-oauth-config\") pod \"console-69687876bd-7x9jf\" (UID: \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\") " pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:01.481408 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.481253 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-service-ca\") pod \"console-69687876bd-7x9jf\" (UID: \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\") " pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:01.481408 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.481334 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-oauth-serving-cert\") pod \"console-69687876bd-7x9jf\" (UID: \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\") " pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:01.481408 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.481377 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-console-config\") pod \"console-69687876bd-7x9jf\" (UID: \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\") " pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:01.481507 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.481410 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-console-serving-cert\") pod \"console-69687876bd-7x9jf\" (UID: \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\") " pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:01.581907 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.581861 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-service-ca\") pod \"console-69687876bd-7x9jf\" (UID: \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\") " pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:01.582012 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.581916 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-oauth-serving-cert\") pod \"console-69687876bd-7x9jf\" (UID: \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\") " pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:01.582012 ip-10-0-135-9 kubenswrapper[2572]: I0422 
16:01:01.581939 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-console-config\") pod \"console-69687876bd-7x9jf\" (UID: \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\") " pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:01.582012 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.581961 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-console-serving-cert\") pod \"console-69687876bd-7x9jf\" (UID: \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\") " pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:01.582012 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.581989 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p42ng\" (UniqueName: \"kubernetes.io/projected/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-kube-api-access-p42ng\") pod \"console-69687876bd-7x9jf\" (UID: \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\") " pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:01.582012 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.582009 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-console-oauth-config\") pod \"console-69687876bd-7x9jf\" (UID: \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\") " pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:01.582778 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.582711 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-oauth-serving-cert\") pod \"console-69687876bd-7x9jf\" (UID: \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\") " pod="openshift-console/console-69687876bd-7x9jf" 
Apr 22 16:01:01.582778 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.582730 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-console-config\") pod \"console-69687876bd-7x9jf\" (UID: \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\") " pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:01.582778 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.582711 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-service-ca\") pod \"console-69687876bd-7x9jf\" (UID: \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\") " pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:01.584613 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.584593 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-console-oauth-config\") pod \"console-69687876bd-7x9jf\" (UID: \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\") " pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:01.584720 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.584633 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-console-serving-cert\") pod \"console-69687876bd-7x9jf\" (UID: \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\") " pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:01.590091 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.590060 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p42ng\" (UniqueName: \"kubernetes.io/projected/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-kube-api-access-p42ng\") pod \"console-69687876bd-7x9jf\" (UID: \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\") " 
pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:01.659327 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.659235 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:01.786757 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:01.786716 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69687876bd-7x9jf"] Apr 22 16:01:01.789405 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:01:01.789376 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20b3fd7e_7823_4dae_befc_3c9ff06bb3cf.slice/crio-b5e8b390ee4700b7a443cd090023716719d1a6f740b1069863945bfdde2f87e8 WatchSource:0}: Error finding container b5e8b390ee4700b7a443cd090023716719d1a6f740b1069863945bfdde2f87e8: Status 404 returned error can't find the container with id b5e8b390ee4700b7a443cd090023716719d1a6f740b1069863945bfdde2f87e8 Apr 22 16:01:02.112882 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:02.112847 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69687876bd-7x9jf" event={"ID":"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf","Type":"ContainerStarted","Data":"b5e8b390ee4700b7a443cd090023716719d1a6f740b1069863945bfdde2f87e8"} Apr 22 16:01:05.122792 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:05.122699 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69687876bd-7x9jf" event={"ID":"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf","Type":"ContainerStarted","Data":"a4a6c50f8cc0cde240d5ad61c316060511065e9a00d3599211224d8d9247b1a6"} Apr 22 16:01:05.138989 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:05.138922 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69687876bd-7x9jf" podStartSLOduration=1.090157892 podStartE2EDuration="4.138903194s" podCreationTimestamp="2026-04-22 16:01:01 +0000 UTC" 
firstStartedPulling="2026-04-22 16:01:01.791396905 +0000 UTC m=+142.690950957" lastFinishedPulling="2026-04-22 16:01:04.84014221 +0000 UTC m=+145.739696259" observedRunningTime="2026-04-22 16:01:05.13857192 +0000 UTC m=+146.038125992" watchObservedRunningTime="2026-04-22 16:01:05.138903194 +0000 UTC m=+146.038457266" Apr 22 16:01:11.659868 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:11.659815 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:11.659868 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:11.659879 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:11.664785 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:11.664762 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:12.145970 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:12.145941 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:14.148886 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:14.148850 2572 generic.go:358] "Generic (PLEG): container finished" podID="8d7ddd84-35ed-400b-ad69-647f50964d8c" containerID="ac49a8404b39aa902a0ada07c259fc70860b9483e29bdadf43278dfcce94c8ca" exitCode=0 Apr 22 16:01:14.149296 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:14.148922 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j58f5" event={"ID":"8d7ddd84-35ed-400b-ad69-647f50964d8c","Type":"ContainerDied","Data":"ac49a8404b39aa902a0ada07c259fc70860b9483e29bdadf43278dfcce94c8ca"} Apr 22 16:01:14.149296 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:14.149276 2572 scope.go:117] "RemoveContainer" containerID="ac49a8404b39aa902a0ada07c259fc70860b9483e29bdadf43278dfcce94c8ca" 
Apr 22 16:01:15.154292 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:15.153244 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j58f5" event={"ID":"8d7ddd84-35ed-400b-ad69-647f50964d8c","Type":"ContainerStarted","Data":"69c538f0bbba984fb56304e39154db2161c0a78929225b8773b3048a3e32f036"} Apr 22 16:01:15.159171 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:15.159092 2572 generic.go:358] "Generic (PLEG): container finished" podID="59310538-efb1-4059-a9c2-3dac6061df75" containerID="c9c2a80f899a605505b8cbcd156d0a1f74e7c364e948a18d6753d7462231342d" exitCode=0 Apr 22 16:01:15.159171 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:15.159134 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rm52c" event={"ID":"59310538-efb1-4059-a9c2-3dac6061df75","Type":"ContainerDied","Data":"c9c2a80f899a605505b8cbcd156d0a1f74e7c364e948a18d6753d7462231342d"} Apr 22 16:01:15.159509 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:15.159492 2572 scope.go:117] "RemoveContainer" containerID="c9c2a80f899a605505b8cbcd156d0a1f74e7c364e948a18d6753d7462231342d" Apr 22 16:01:16.164075 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:16.164039 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rm52c" event={"ID":"59310538-efb1-4059-a9c2-3dac6061df75","Type":"ContainerStarted","Data":"6eadc1f54fb1e34986611cba0006e28c5102b8b1ac7dcbecef943e6834e10821"} Apr 22 16:01:17.167789 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:17.167754 2572 generic.go:358] "Generic (PLEG): container finished" podID="14ff43e0-e359-4557-9f79-d5452a8479a0" containerID="4eff5c411b6e1b25f382feee3ed7a49ce01379fcd3b9cf4c4a4a4b194d7b3774" exitCode=0 Apr 22 16:01:17.168229 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:17.167795 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-operator-585dfdc468-jbznj" event={"ID":"14ff43e0-e359-4557-9f79-d5452a8479a0","Type":"ContainerDied","Data":"4eff5c411b6e1b25f382feee3ed7a49ce01379fcd3b9cf4c4a4a4b194d7b3774"} Apr 22 16:01:17.168229 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:17.168147 2572 scope.go:117] "RemoveContainer" containerID="4eff5c411b6e1b25f382feee3ed7a49ce01379fcd3b9cf4c4a4a4b194d7b3774" Apr 22 16:01:18.172020 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:18.171985 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-jbznj" event={"ID":"14ff43e0-e359-4557-9f79-d5452a8479a0","Type":"ContainerStarted","Data":"5239d957f2b4479a984ca9b061fb788482e5a79e11ae6d63089b0443546ab0e9"} Apr 22 16:01:22.165289 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:22.165247 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69687876bd-7x9jf"] Apr 22 16:01:47.185261 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:47.185184 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-69687876bd-7x9jf" podUID="20b3fd7e-7823-4dae-befc-3c9ff06bb3cf" containerName="console" containerID="cri-o://a4a6c50f8cc0cde240d5ad61c316060511065e9a00d3599211224d8d9247b1a6" gracePeriod=15 Apr 22 16:01:47.416423 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:47.416397 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69687876bd-7x9jf_20b3fd7e-7823-4dae-befc-3c9ff06bb3cf/console/0.log" Apr 22 16:01:47.416568 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:47.416470 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:47.533525 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:47.533430 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p42ng\" (UniqueName: \"kubernetes.io/projected/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-kube-api-access-p42ng\") pod \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\" (UID: \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\") " Apr 22 16:01:47.533525 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:47.533466 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-console-oauth-config\") pod \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\" (UID: \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\") " Apr 22 16:01:47.533525 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:47.533505 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-console-config\") pod \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\" (UID: \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\") " Apr 22 16:01:47.533784 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:47.533543 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-console-serving-cert\") pod \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\" (UID: \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\") " Apr 22 16:01:47.533784 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:47.533571 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-service-ca\") pod \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\" (UID: \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\") " Apr 22 16:01:47.533784 ip-10-0-135-9 
kubenswrapper[2572]: I0422 16:01:47.533622 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-oauth-serving-cert\") pod \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\" (UID: \"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf\") " Apr 22 16:01:47.534072 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:47.534033 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-console-config" (OuterVolumeSpecName: "console-config") pod "20b3fd7e-7823-4dae-befc-3c9ff06bb3cf" (UID: "20b3fd7e-7823-4dae-befc-3c9ff06bb3cf"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:01:47.534072 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:47.534050 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-service-ca" (OuterVolumeSpecName: "service-ca") pod "20b3fd7e-7823-4dae-befc-3c9ff06bb3cf" (UID: "20b3fd7e-7823-4dae-befc-3c9ff06bb3cf"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:01:47.534072 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:47.534042 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "20b3fd7e-7823-4dae-befc-3c9ff06bb3cf" (UID: "20b3fd7e-7823-4dae-befc-3c9ff06bb3cf"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:01:47.536015 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:47.535995 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "20b3fd7e-7823-4dae-befc-3c9ff06bb3cf" (UID: "20b3fd7e-7823-4dae-befc-3c9ff06bb3cf"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:01:47.536310 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:47.536289 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "20b3fd7e-7823-4dae-befc-3c9ff06bb3cf" (UID: "20b3fd7e-7823-4dae-befc-3c9ff06bb3cf"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:01:47.536398 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:47.536305 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-kube-api-access-p42ng" (OuterVolumeSpecName: "kube-api-access-p42ng") pod "20b3fd7e-7823-4dae-befc-3c9ff06bb3cf" (UID: "20b3fd7e-7823-4dae-befc-3c9ff06bb3cf"). InnerVolumeSpecName "kube-api-access-p42ng". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:01:47.634785 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:47.634729 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-oauth-serving-cert\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:01:47.634785 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:47.634774 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p42ng\" (UniqueName: \"kubernetes.io/projected/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-kube-api-access-p42ng\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:01:47.634785 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:47.634785 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-console-oauth-config\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:01:47.634785 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:47.634795 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-console-config\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:01:47.634785 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:47.634805 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-console-serving-cert\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:01:47.635086 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:47.634815 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf-service-ca\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:01:48.255999 ip-10-0-135-9 kubenswrapper[2572]: I0422 
16:01:48.255973 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69687876bd-7x9jf_20b3fd7e-7823-4dae-befc-3c9ff06bb3cf/console/0.log" Apr 22 16:01:48.256407 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:48.256013 2572 generic.go:358] "Generic (PLEG): container finished" podID="20b3fd7e-7823-4dae-befc-3c9ff06bb3cf" containerID="a4a6c50f8cc0cde240d5ad61c316060511065e9a00d3599211224d8d9247b1a6" exitCode=2 Apr 22 16:01:48.256407 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:48.256078 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69687876bd-7x9jf" Apr 22 16:01:48.256407 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:48.256100 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69687876bd-7x9jf" event={"ID":"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf","Type":"ContainerDied","Data":"a4a6c50f8cc0cde240d5ad61c316060511065e9a00d3599211224d8d9247b1a6"} Apr 22 16:01:48.256407 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:48.256150 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69687876bd-7x9jf" event={"ID":"20b3fd7e-7823-4dae-befc-3c9ff06bb3cf","Type":"ContainerDied","Data":"b5e8b390ee4700b7a443cd090023716719d1a6f740b1069863945bfdde2f87e8"} Apr 22 16:01:48.256407 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:48.256171 2572 scope.go:117] "RemoveContainer" containerID="a4a6c50f8cc0cde240d5ad61c316060511065e9a00d3599211224d8d9247b1a6" Apr 22 16:01:48.264099 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:48.264067 2572 scope.go:117] "RemoveContainer" containerID="a4a6c50f8cc0cde240d5ad61c316060511065e9a00d3599211224d8d9247b1a6" Apr 22 16:01:48.264394 ip-10-0-135-9 kubenswrapper[2572]: E0422 16:01:48.264372 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4a6c50f8cc0cde240d5ad61c316060511065e9a00d3599211224d8d9247b1a6\": 
container with ID starting with a4a6c50f8cc0cde240d5ad61c316060511065e9a00d3599211224d8d9247b1a6 not found: ID does not exist" containerID="a4a6c50f8cc0cde240d5ad61c316060511065e9a00d3599211224d8d9247b1a6" Apr 22 16:01:48.264497 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:48.264400 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4a6c50f8cc0cde240d5ad61c316060511065e9a00d3599211224d8d9247b1a6"} err="failed to get container status \"a4a6c50f8cc0cde240d5ad61c316060511065e9a00d3599211224d8d9247b1a6\": rpc error: code = NotFound desc = could not find container \"a4a6c50f8cc0cde240d5ad61c316060511065e9a00d3599211224d8d9247b1a6\": container with ID starting with a4a6c50f8cc0cde240d5ad61c316060511065e9a00d3599211224d8d9247b1a6 not found: ID does not exist" Apr 22 16:01:48.273773 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:48.273746 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69687876bd-7x9jf"] Apr 22 16:01:48.277035 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:48.277009 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-69687876bd-7x9jf"] Apr 22 16:01:49.644834 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:49.644793 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b3fd7e-7823-4dae-befc-3c9ff06bb3cf" path="/var/lib/kubelet/pods/20b3fd7e-7823-4dae-befc-3c9ff06bb3cf/volumes" Apr 22 16:01:57.619014 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.618978 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-68f4f8fc6-jkg98"] Apr 22 16:01:57.619491 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.619260 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20b3fd7e-7823-4dae-befc-3c9ff06bb3cf" containerName="console" Apr 22 16:01:57.619491 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.619272 2572 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="20b3fd7e-7823-4dae-befc-3c9ff06bb3cf" containerName="console" Apr 22 16:01:57.619491 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.619326 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="20b3fd7e-7823-4dae-befc-3c9ff06bb3cf" containerName="console" Apr 22 16:01:57.621095 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.621078 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:01:57.624593 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.624566 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 16:01:57.624731 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.624604 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-6gvpr\"" Apr 22 16:01:57.624731 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.624651 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 16:01:57.624731 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.624700 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 16:01:57.624731 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.624577 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 16:01:57.624906 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.624784 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 16:01:57.625095 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.625078 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 16:01:57.625158 ip-10-0-135-9 kubenswrapper[2572]: I0422 
16:01:57.625082 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 16:01:57.629579 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.629558 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 16:01:57.633062 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.633039 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68f4f8fc6-jkg98"] Apr 22 16:01:57.711146 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.711104 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24164a60-dfe4-4424-a7dd-ce0172e05f3a-trusted-ca-bundle\") pod \"console-68f4f8fc6-jkg98\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") " pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:01:57.711146 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.711147 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/24164a60-dfe4-4424-a7dd-ce0172e05f3a-console-config\") pod \"console-68f4f8fc6-jkg98\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") " pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:01:57.711425 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.711176 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24164a60-dfe4-4424-a7dd-ce0172e05f3a-service-ca\") pod \"console-68f4f8fc6-jkg98\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") " pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:01:57.711425 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.711255 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/24164a60-dfe4-4424-a7dd-ce0172e05f3a-oauth-serving-cert\") pod \"console-68f4f8fc6-jkg98\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") " pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:01:57.711425 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.711291 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/24164a60-dfe4-4424-a7dd-ce0172e05f3a-console-oauth-config\") pod \"console-68f4f8fc6-jkg98\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") " pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:01:57.711425 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.711322 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzn29\" (UniqueName: \"kubernetes.io/projected/24164a60-dfe4-4424-a7dd-ce0172e05f3a-kube-api-access-xzn29\") pod \"console-68f4f8fc6-jkg98\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") " pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:01:57.711425 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.711354 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/24164a60-dfe4-4424-a7dd-ce0172e05f3a-console-serving-cert\") pod \"console-68f4f8fc6-jkg98\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") " pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:01:57.812486 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.812446 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24164a60-dfe4-4424-a7dd-ce0172e05f3a-trusted-ca-bundle\") pod \"console-68f4f8fc6-jkg98\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") " pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:01:57.812486 ip-10-0-135-9 kubenswrapper[2572]: 
I0422 16:01:57.812490 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/24164a60-dfe4-4424-a7dd-ce0172e05f3a-console-config\") pod \"console-68f4f8fc6-jkg98\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") " pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:01:57.812712 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.812516 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24164a60-dfe4-4424-a7dd-ce0172e05f3a-service-ca\") pod \"console-68f4f8fc6-jkg98\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") " pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:01:57.812758 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.812721 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/24164a60-dfe4-4424-a7dd-ce0172e05f3a-oauth-serving-cert\") pod \"console-68f4f8fc6-jkg98\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") " pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:01:57.812802 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.812783 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/24164a60-dfe4-4424-a7dd-ce0172e05f3a-console-oauth-config\") pod \"console-68f4f8fc6-jkg98\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") " pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:01:57.812853 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.812819 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzn29\" (UniqueName: \"kubernetes.io/projected/24164a60-dfe4-4424-a7dd-ce0172e05f3a-kube-api-access-xzn29\") pod \"console-68f4f8fc6-jkg98\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") " pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 
16:01:57.812910 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.812859 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/24164a60-dfe4-4424-a7dd-ce0172e05f3a-console-serving-cert\") pod \"console-68f4f8fc6-jkg98\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") " pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:01:57.813354 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.813318 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/24164a60-dfe4-4424-a7dd-ce0172e05f3a-console-config\") pod \"console-68f4f8fc6-jkg98\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") " pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:01:57.813354 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.813341 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24164a60-dfe4-4424-a7dd-ce0172e05f3a-service-ca\") pod \"console-68f4f8fc6-jkg98\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") " pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:01:57.813532 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.813468 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/24164a60-dfe4-4424-a7dd-ce0172e05f3a-oauth-serving-cert\") pod \"console-68f4f8fc6-jkg98\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") " pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:01:57.813532 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.813491 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24164a60-dfe4-4424-a7dd-ce0172e05f3a-trusted-ca-bundle\") pod \"console-68f4f8fc6-jkg98\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") " pod="openshift-console/console-68f4f8fc6-jkg98" Apr 
22 16:01:57.815427 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.815406 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/24164a60-dfe4-4424-a7dd-ce0172e05f3a-console-oauth-config\") pod \"console-68f4f8fc6-jkg98\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") " pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:01:57.815521 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.815504 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/24164a60-dfe4-4424-a7dd-ce0172e05f3a-console-serving-cert\") pod \"console-68f4f8fc6-jkg98\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") " pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:01:57.820559 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.820534 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzn29\" (UniqueName: \"kubernetes.io/projected/24164a60-dfe4-4424-a7dd-ce0172e05f3a-kube-api-access-xzn29\") pod \"console-68f4f8fc6-jkg98\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") " pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:01:57.931234 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:57.931123 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:01:58.051253 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:58.051221 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68f4f8fc6-jkg98"] Apr 22 16:01:58.054932 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:01:58.054899 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24164a60_dfe4_4424_a7dd_ce0172e05f3a.slice/crio-7dac0f9a8a830650099ba3e84a06aa200f3c3b00d4fd6f54e9920f3d1cb2c40a WatchSource:0}: Error finding container 7dac0f9a8a830650099ba3e84a06aa200f3c3b00d4fd6f54e9920f3d1cb2c40a: Status 404 returned error can't find the container with id 7dac0f9a8a830650099ba3e84a06aa200f3c3b00d4fd6f54e9920f3d1cb2c40a Apr 22 16:01:58.286981 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:58.286883 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68f4f8fc6-jkg98" event={"ID":"24164a60-dfe4-4424-a7dd-ce0172e05f3a","Type":"ContainerStarted","Data":"aea7be47de80212d6f511dfbbae3e5119af2977e917d51e6585b5bbf3d4deab3"} Apr 22 16:01:58.286981 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:58.286931 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68f4f8fc6-jkg98" event={"ID":"24164a60-dfe4-4424-a7dd-ce0172e05f3a","Type":"ContainerStarted","Data":"7dac0f9a8a830650099ba3e84a06aa200f3c3b00d4fd6f54e9920f3d1cb2c40a"} Apr 22 16:01:58.306036 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:01:58.305817 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68f4f8fc6-jkg98" podStartSLOduration=1.305797293 podStartE2EDuration="1.305797293s" podCreationTimestamp="2026-04-22 16:01:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:01:58.304864665 +0000 UTC m=+199.204418736" 
watchObservedRunningTime="2026-04-22 16:01:58.305797293 +0000 UTC m=+199.205351364" Apr 22 16:02:07.931635 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:02:07.931592 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:02:07.931635 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:02:07.931645 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:02:07.936602 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:02:07.936578 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:02:08.320534 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:02:08.320451 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-68f4f8fc6-jkg98" Apr 22 16:03:39.520403 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:03:39.520374 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxznf_fd3073fe-435c-4974-821b-9229018bf5f4/ovn-acl-logging/0.log" Apr 22 16:03:39.522725 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:03:39.522701 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxznf_fd3073fe-435c-4974-821b-9229018bf5f4/ovn-acl-logging/0.log" Apr 22 16:03:39.526800 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:03:39.526785 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 16:05:02.816381 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:02.816345 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm"] Apr 22 16:05:02.818496 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:02.818475 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm" Apr 22 16:05:02.820810 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:02.820789 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-jvd26\"" Apr 22 16:05:02.821125 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:02.821108 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 16:05:02.821651 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:02.821633 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 16:05:02.826948 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:02.826905 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm"] Apr 22 16:05:02.952580 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:02.952541 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fd456e73-c30c-43b2-beba-678dda4a23c0-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm\" (UID: \"fd456e73-c30c-43b2-beba-678dda4a23c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm" Apr 22 16:05:02.952580 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:02.952579 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4sjg\" (UniqueName: \"kubernetes.io/projected/fd456e73-c30c-43b2-beba-678dda4a23c0-kube-api-access-z4sjg\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm\" (UID: \"fd456e73-c30c-43b2-beba-678dda4a23c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm" Apr 22 
16:05:02.952818 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:02.952624 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd456e73-c30c-43b2-beba-678dda4a23c0-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm\" (UID: \"fd456e73-c30c-43b2-beba-678dda4a23c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm"
Apr 22 16:05:03.053613 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:03.053562 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd456e73-c30c-43b2-beba-678dda4a23c0-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm\" (UID: \"fd456e73-c30c-43b2-beba-678dda4a23c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm"
Apr 22 16:05:03.053784 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:03.053629 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fd456e73-c30c-43b2-beba-678dda4a23c0-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm\" (UID: \"fd456e73-c30c-43b2-beba-678dda4a23c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm"
Apr 22 16:05:03.053784 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:03.053722 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4sjg\" (UniqueName: \"kubernetes.io/projected/fd456e73-c30c-43b2-beba-678dda4a23c0-kube-api-access-z4sjg\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm\" (UID: \"fd456e73-c30c-43b2-beba-678dda4a23c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm"
Apr 22 16:05:03.053942 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:03.053927 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fd456e73-c30c-43b2-beba-678dda4a23c0-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm\" (UID: \"fd456e73-c30c-43b2-beba-678dda4a23c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm"
Apr 22 16:05:03.053989 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:03.053972 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd456e73-c30c-43b2-beba-678dda4a23c0-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm\" (UID: \"fd456e73-c30c-43b2-beba-678dda4a23c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm"
Apr 22 16:05:03.066553 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:03.066486 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4sjg\" (UniqueName: \"kubernetes.io/projected/fd456e73-c30c-43b2-beba-678dda4a23c0-kube-api-access-z4sjg\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm\" (UID: \"fd456e73-c30c-43b2-beba-678dda4a23c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm"
Apr 22 16:05:03.128333 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:03.128297 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm"
Apr 22 16:05:03.257304 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:03.257254 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm"]
Apr 22 16:05:03.260163 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:05:03.260131 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd456e73_c30c_43b2_beba_678dda4a23c0.slice/crio-6be5875aeff2b1db41f779825753d93c6e311624351694a19548e7f428709623 WatchSource:0}: Error finding container 6be5875aeff2b1db41f779825753d93c6e311624351694a19548e7f428709623: Status 404 returned error can't find the container with id 6be5875aeff2b1db41f779825753d93c6e311624351694a19548e7f428709623
Apr 22 16:05:03.261976 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:03.261962 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 16:05:03.770331 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:03.770290 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm" event={"ID":"fd456e73-c30c-43b2-beba-678dda4a23c0","Type":"ContainerStarted","Data":"6be5875aeff2b1db41f779825753d93c6e311624351694a19548e7f428709623"}
Apr 22 16:05:09.788779 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:09.788732 2572 generic.go:358] "Generic (PLEG): container finished" podID="fd456e73-c30c-43b2-beba-678dda4a23c0" containerID="8b0fec8a696cd48ec423d3560dfe62a72dd70710b749e14c3efd245f34ed5cbd" exitCode=0
Apr 22 16:05:09.789233 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:09.788805 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm" event={"ID":"fd456e73-c30c-43b2-beba-678dda4a23c0","Type":"ContainerDied","Data":"8b0fec8a696cd48ec423d3560dfe62a72dd70710b749e14c3efd245f34ed5cbd"}
Apr 22 16:05:10.521396 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:10.521361 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-8648p"]
Apr 22 16:05:10.523917 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:10.523895 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-8648p"
Apr 22 16:05:10.527068 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:10.527040 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 22 16:05:10.527215 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:10.527040 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 22 16:05:10.527215 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:10.527086 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-npxrt\""
Apr 22 16:05:10.533872 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:10.533843 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-8648p"]
Apr 22 16:05:10.621134 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:10.621100 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62g7d\" (UniqueName: \"kubernetes.io/projected/526173ee-30b3-484b-986c-ff41e7117a61-kube-api-access-62g7d\") pod \"cert-manager-webhook-587ccfb98-8648p\" (UID: \"526173ee-30b3-484b-986c-ff41e7117a61\") " pod="cert-manager/cert-manager-webhook-587ccfb98-8648p"
Apr 22 16:05:10.621134 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:10.621145 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/526173ee-30b3-484b-986c-ff41e7117a61-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-8648p\" (UID: \"526173ee-30b3-484b-986c-ff41e7117a61\") " pod="cert-manager/cert-manager-webhook-587ccfb98-8648p"
Apr 22 16:05:10.722585 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:10.722539 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62g7d\" (UniqueName: \"kubernetes.io/projected/526173ee-30b3-484b-986c-ff41e7117a61-kube-api-access-62g7d\") pod \"cert-manager-webhook-587ccfb98-8648p\" (UID: \"526173ee-30b3-484b-986c-ff41e7117a61\") " pod="cert-manager/cert-manager-webhook-587ccfb98-8648p"
Apr 22 16:05:10.722585 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:10.722586 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/526173ee-30b3-484b-986c-ff41e7117a61-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-8648p\" (UID: \"526173ee-30b3-484b-986c-ff41e7117a61\") " pod="cert-manager/cert-manager-webhook-587ccfb98-8648p"
Apr 22 16:05:10.730749 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:10.730713 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/526173ee-30b3-484b-986c-ff41e7117a61-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-8648p\" (UID: \"526173ee-30b3-484b-986c-ff41e7117a61\") " pod="cert-manager/cert-manager-webhook-587ccfb98-8648p"
Apr 22 16:05:10.730908 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:10.730820 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62g7d\" (UniqueName: \"kubernetes.io/projected/526173ee-30b3-484b-986c-ff41e7117a61-kube-api-access-62g7d\") pod \"cert-manager-webhook-587ccfb98-8648p\" (UID: \"526173ee-30b3-484b-986c-ff41e7117a61\") " pod="cert-manager/cert-manager-webhook-587ccfb98-8648p"
Apr 22 16:05:10.834726 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:10.834687 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-8648p"
Apr 22 16:05:10.981828 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:10.981789 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-8648p"]
Apr 22 16:05:10.986088 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:05:10.986057 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod526173ee_30b3_484b_986c_ff41e7117a61.slice/crio-a2ee1cf28edea5e98f765897edd6f381b3b8caf64c1e4e94afcc0251a896f358 WatchSource:0}: Error finding container a2ee1cf28edea5e98f765897edd6f381b3b8caf64c1e4e94afcc0251a896f358: Status 404 returned error can't find the container with id a2ee1cf28edea5e98f765897edd6f381b3b8caf64c1e4e94afcc0251a896f358
Apr 22 16:05:11.795405 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:11.795367 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-8648p" event={"ID":"526173ee-30b3-484b-986c-ff41e7117a61","Type":"ContainerStarted","Data":"a2ee1cf28edea5e98f765897edd6f381b3b8caf64c1e4e94afcc0251a896f358"}
Apr 22 16:05:12.801093 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:12.800977 2572 generic.go:358] "Generic (PLEG): container finished" podID="fd456e73-c30c-43b2-beba-678dda4a23c0" containerID="a8f136125ae4898a8210e1bbf4e41bbc405228122b83c61b434c039531d284b8" exitCode=0
Apr 22 16:05:12.801093 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:12.801069 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm" event={"ID":"fd456e73-c30c-43b2-beba-678dda4a23c0","Type":"ContainerDied","Data":"a8f136125ae4898a8210e1bbf4e41bbc405228122b83c61b434c039531d284b8"}
Apr 22 16:05:14.808133 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:14.808096 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-8648p" event={"ID":"526173ee-30b3-484b-986c-ff41e7117a61","Type":"ContainerStarted","Data":"84388ef4a4be6291d64198d8a75abbb9b438a7ee623f2b41e1bb9357b6370f64"}
Apr 22 16:05:14.808543 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:14.808239 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-8648p"
Apr 22 16:05:14.825102 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:14.825041 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-8648p" podStartSLOduration=1.6043939329999999 podStartE2EDuration="4.825022725s" podCreationTimestamp="2026-04-22 16:05:10 +0000 UTC" firstStartedPulling="2026-04-22 16:05:10.988633405 +0000 UTC m=+391.888187458" lastFinishedPulling="2026-04-22 16:05:14.209262201 +0000 UTC m=+395.108816250" observedRunningTime="2026-04-22 16:05:14.82439212 +0000 UTC m=+395.723946192" watchObservedRunningTime="2026-04-22 16:05:14.825022725 +0000 UTC m=+395.724576796"
Apr 22 16:05:20.813531 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:20.813504 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-8648p"
Apr 22 16:05:20.826384 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:20.826350 2572 generic.go:358] "Generic (PLEG): container finished" podID="fd456e73-c30c-43b2-beba-678dda4a23c0" containerID="485a5e3a61c7bff3e38fad30422f249235b9812122a1c27650f66ca8ba12b7b8" exitCode=0
Apr 22 16:05:20.826586 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:20.826448 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm" event={"ID":"fd456e73-c30c-43b2-beba-678dda4a23c0","Type":"ContainerDied","Data":"485a5e3a61c7bff3e38fad30422f249235b9812122a1c27650f66ca8ba12b7b8"}
Apr 22 16:05:21.951898 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:21.951872 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm"
Apr 22 16:05:22.124467 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:22.124380 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fd456e73-c30c-43b2-beba-678dda4a23c0-util\") pod \"fd456e73-c30c-43b2-beba-678dda4a23c0\" (UID: \"fd456e73-c30c-43b2-beba-678dda4a23c0\") "
Apr 22 16:05:22.124467 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:22.124432 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4sjg\" (UniqueName: \"kubernetes.io/projected/fd456e73-c30c-43b2-beba-678dda4a23c0-kube-api-access-z4sjg\") pod \"fd456e73-c30c-43b2-beba-678dda4a23c0\" (UID: \"fd456e73-c30c-43b2-beba-678dda4a23c0\") "
Apr 22 16:05:22.124668 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:22.124503 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd456e73-c30c-43b2-beba-678dda4a23c0-bundle\") pod \"fd456e73-c30c-43b2-beba-678dda4a23c0\" (UID: \"fd456e73-c30c-43b2-beba-678dda4a23c0\") "
Apr 22 16:05:22.124904 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:22.124870 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd456e73-c30c-43b2-beba-678dda4a23c0-bundle" (OuterVolumeSpecName: "bundle") pod "fd456e73-c30c-43b2-beba-678dda4a23c0" (UID: "fd456e73-c30c-43b2-beba-678dda4a23c0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 16:05:22.126865 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:22.126838 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd456e73-c30c-43b2-beba-678dda4a23c0-kube-api-access-z4sjg" (OuterVolumeSpecName: "kube-api-access-z4sjg") pod "fd456e73-c30c-43b2-beba-678dda4a23c0" (UID: "fd456e73-c30c-43b2-beba-678dda4a23c0"). InnerVolumeSpecName "kube-api-access-z4sjg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:05:22.128477 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:22.128456 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd456e73-c30c-43b2-beba-678dda4a23c0-util" (OuterVolumeSpecName: "util") pod "fd456e73-c30c-43b2-beba-678dda4a23c0" (UID: "fd456e73-c30c-43b2-beba-678dda4a23c0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 16:05:22.224976 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:22.224940 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fd456e73-c30c-43b2-beba-678dda4a23c0-bundle\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\""
Apr 22 16:05:22.224976 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:22.224970 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fd456e73-c30c-43b2-beba-678dda4a23c0-util\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\""
Apr 22 16:05:22.224976 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:22.224979 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z4sjg\" (UniqueName: \"kubernetes.io/projected/fd456e73-c30c-43b2-beba-678dda4a23c0-kube-api-access-z4sjg\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\""
Apr 22 16:05:22.833387 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:22.833350 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm" event={"ID":"fd456e73-c30c-43b2-beba-678dda4a23c0","Type":"ContainerDied","Data":"6be5875aeff2b1db41f779825753d93c6e311624351694a19548e7f428709623"}
Apr 22 16:05:22.833387 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:22.833391 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6be5875aeff2b1db41f779825753d93c6e311624351694a19548e7f428709623"
Apr 22 16:05:22.833643 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:22.833363 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f57fcm"
Apr 22 16:05:45.776535 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.776446 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-7d868c4d86-94h4g"]
Apr 22 16:05:45.777079 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.777024 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd456e73-c30c-43b2-beba-678dda4a23c0" containerName="util"
Apr 22 16:05:45.777079 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.777043 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd456e73-c30c-43b2-beba-678dda4a23c0" containerName="util"
Apr 22 16:05:45.777079 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.777074 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd456e73-c30c-43b2-beba-678dda4a23c0" containerName="extract"
Apr 22 16:05:45.777263 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.777082 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd456e73-c30c-43b2-beba-678dda4a23c0" containerName="extract"
Apr 22 16:05:45.777263 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.777097 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd456e73-c30c-43b2-beba-678dda4a23c0" containerName="pull"
Apr 22 16:05:45.777263 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.777106 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd456e73-c30c-43b2-beba-678dda4a23c0" containerName="pull"
Apr 22 16:05:45.777263 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.777248 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd456e73-c30c-43b2-beba-678dda4a23c0" containerName="extract"
Apr 22 16:05:45.781253 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.781225 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-94h4g"
Apr 22 16:05:45.784915 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.784846 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 22 16:05:45.784915 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.784863 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-2mt2p\""
Apr 22 16:05:45.784915 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.784883 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 22 16:05:45.784915 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.784883 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 22 16:05:45.785232 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.784888 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 22 16:05:45.785232 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.785043 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 22 16:05:45.788936 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.788910 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7d868c4d86-94h4g"]
Apr 22 16:05:45.797261 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.797215 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/efc86556-664b-4f91-906f-18b172fc9c9c-manager-config\") pod \"lws-controller-manager-7d868c4d86-94h4g\" (UID: \"efc86556-664b-4f91-906f-18b172fc9c9c\") " pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-94h4g"
Apr 22 16:05:45.797417 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.797352 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhzzn\" (UniqueName: \"kubernetes.io/projected/efc86556-664b-4f91-906f-18b172fc9c9c-kube-api-access-nhzzn\") pod \"lws-controller-manager-7d868c4d86-94h4g\" (UID: \"efc86556-664b-4f91-906f-18b172fc9c9c\") " pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-94h4g"
Apr 22 16:05:45.797417 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.797390 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/efc86556-664b-4f91-906f-18b172fc9c9c-metrics-cert\") pod \"lws-controller-manager-7d868c4d86-94h4g\" (UID: \"efc86556-664b-4f91-906f-18b172fc9c9c\") " pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-94h4g"
Apr 22 16:05:45.797417 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.797413 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efc86556-664b-4f91-906f-18b172fc9c9c-cert\") pod \"lws-controller-manager-7d868c4d86-94h4g\" (UID: \"efc86556-664b-4f91-906f-18b172fc9c9c\") " pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-94h4g"
Apr 22 16:05:45.898003 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.897948 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhzzn\" (UniqueName: \"kubernetes.io/projected/efc86556-664b-4f91-906f-18b172fc9c9c-kube-api-access-nhzzn\") pod \"lws-controller-manager-7d868c4d86-94h4g\" (UID: \"efc86556-664b-4f91-906f-18b172fc9c9c\") " pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-94h4g"
Apr 22 16:05:45.898003 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.898011 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/efc86556-664b-4f91-906f-18b172fc9c9c-metrics-cert\") pod \"lws-controller-manager-7d868c4d86-94h4g\" (UID: \"efc86556-664b-4f91-906f-18b172fc9c9c\") " pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-94h4g"
Apr 22 16:05:45.898317 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.898042 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efc86556-664b-4f91-906f-18b172fc9c9c-cert\") pod \"lws-controller-manager-7d868c4d86-94h4g\" (UID: \"efc86556-664b-4f91-906f-18b172fc9c9c\") " pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-94h4g"
Apr 22 16:05:45.898317 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.898072 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/efc86556-664b-4f91-906f-18b172fc9c9c-manager-config\") pod \"lws-controller-manager-7d868c4d86-94h4g\" (UID: \"efc86556-664b-4f91-906f-18b172fc9c9c\") " pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-94h4g"
Apr 22 16:05:45.898767 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.898746 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/efc86556-664b-4f91-906f-18b172fc9c9c-manager-config\") pod \"lws-controller-manager-7d868c4d86-94h4g\" (UID: \"efc86556-664b-4f91-906f-18b172fc9c9c\") " pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-94h4g"
Apr 22 16:05:45.900834 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.900804 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efc86556-664b-4f91-906f-18b172fc9c9c-cert\") pod \"lws-controller-manager-7d868c4d86-94h4g\" (UID: \"efc86556-664b-4f91-906f-18b172fc9c9c\") " pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-94h4g"
Apr 22 16:05:45.900927 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.900834 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/efc86556-664b-4f91-906f-18b172fc9c9c-metrics-cert\") pod \"lws-controller-manager-7d868c4d86-94h4g\" (UID: \"efc86556-664b-4f91-906f-18b172fc9c9c\") " pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-94h4g"
Apr 22 16:05:45.905829 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:45.905797 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhzzn\" (UniqueName: \"kubernetes.io/projected/efc86556-664b-4f91-906f-18b172fc9c9c-kube-api-access-nhzzn\") pod \"lws-controller-manager-7d868c4d86-94h4g\" (UID: \"efc86556-664b-4f91-906f-18b172fc9c9c\") " pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-94h4g"
Apr 22 16:05:46.091376 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:46.091341 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-94h4g"
Apr 22 16:05:46.215510 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:46.215475 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7d868c4d86-94h4g"]
Apr 22 16:05:46.219552 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:05:46.219523 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefc86556_664b_4f91_906f_18b172fc9c9c.slice/crio-5ca42c857b99b6364ee765e6429ca540896fd0e1a72c1223bb182ab49494e106 WatchSource:0}: Error finding container 5ca42c857b99b6364ee765e6429ca540896fd0e1a72c1223bb182ab49494e106: Status 404 returned error can't find the container with id 5ca42c857b99b6364ee765e6429ca540896fd0e1a72c1223bb182ab49494e106
Apr 22 16:05:46.898431 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:46.898395 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-94h4g" event={"ID":"efc86556-664b-4f91-906f-18b172fc9c9c","Type":"ContainerStarted","Data":"5ca42c857b99b6364ee765e6429ca540896fd0e1a72c1223bb182ab49494e106"}
Apr 22 16:05:47.904529 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:47.904474 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-94h4g" event={"ID":"efc86556-664b-4f91-906f-18b172fc9c9c","Type":"ContainerStarted","Data":"171a914c65ad362d276678cbc55a33e86cd98c0ef1ae7ab3993d6815cb80f473"}
Apr 22 16:05:47.904972 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:47.904613 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-94h4g"
Apr 22 16:05:47.920178 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:47.920127 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-94h4g" podStartSLOduration=1.359680207 podStartE2EDuration="2.920111675s" podCreationTimestamp="2026-04-22 16:05:45 +0000 UTC" firstStartedPulling="2026-04-22 16:05:46.221548907 +0000 UTC m=+427.121102958" lastFinishedPulling="2026-04-22 16:05:47.781980362 +0000 UTC m=+428.681534426" observedRunningTime="2026-04-22 16:05:47.918479564 +0000 UTC m=+428.818033634" watchObservedRunningTime="2026-04-22 16:05:47.920111675 +0000 UTC m=+428.819665747"
Apr 22 16:05:51.021631 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:51.021592 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r"]
Apr 22 16:05:51.025015 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:51.024999 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r"
Apr 22 16:05:51.027667 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:51.027642 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 16:05:51.027667 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:51.027644 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 16:05:51.027839 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:51.027647 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-jvd26\""
Apr 22 16:05:51.039790 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:51.039759 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r"]
Apr 22 16:05:51.141017 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:51.140976 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92nrh\" (UniqueName: \"kubernetes.io/projected/90b530b0-371c-47aa-91f1-d1c3946537cb-kube-api-access-92nrh\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r\" (UID: \"90b530b0-371c-47aa-91f1-d1c3946537cb\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r"
Apr 22 16:05:51.141188 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:51.141063 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90b530b0-371c-47aa-91f1-d1c3946537cb-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r\" (UID: \"90b530b0-371c-47aa-91f1-d1c3946537cb\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r"
Apr 22 16:05:51.141188 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:51.141094 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90b530b0-371c-47aa-91f1-d1c3946537cb-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r\" (UID: \"90b530b0-371c-47aa-91f1-d1c3946537cb\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r"
Apr 22 16:05:51.241913 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:51.241870 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90b530b0-371c-47aa-91f1-d1c3946537cb-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r\" (UID: \"90b530b0-371c-47aa-91f1-d1c3946537cb\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r"
Apr 22 16:05:51.242081 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:51.241921 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92nrh\" (UniqueName: \"kubernetes.io/projected/90b530b0-371c-47aa-91f1-d1c3946537cb-kube-api-access-92nrh\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r\" (UID: \"90b530b0-371c-47aa-91f1-d1c3946537cb\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r"
Apr 22 16:05:51.242081 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:51.241972 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90b530b0-371c-47aa-91f1-d1c3946537cb-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r\" (UID: \"90b530b0-371c-47aa-91f1-d1c3946537cb\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r"
Apr 22 16:05:51.242336 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:51.242320 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90b530b0-371c-47aa-91f1-d1c3946537cb-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r\" (UID: \"90b530b0-371c-47aa-91f1-d1c3946537cb\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r"
Apr 22 16:05:51.242418 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:51.242393 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90b530b0-371c-47aa-91f1-d1c3946537cb-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r\" (UID: \"90b530b0-371c-47aa-91f1-d1c3946537cb\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r"
Apr 22 16:05:51.255654 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:51.255619 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92nrh\" (UniqueName: \"kubernetes.io/projected/90b530b0-371c-47aa-91f1-d1c3946537cb-kube-api-access-92nrh\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r\" (UID: \"90b530b0-371c-47aa-91f1-d1c3946537cb\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r"
Apr 22 16:05:51.333621 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:51.333588 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r"
Apr 22 16:05:51.453893 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:51.453866 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r"]
Apr 22 16:05:51.455953 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:05:51.455924 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90b530b0_371c_47aa_91f1_d1c3946537cb.slice/crio-f86fe344750c43896d9049949b77d133e0676fee1d732c99e94c831052d073d0 WatchSource:0}: Error finding container f86fe344750c43896d9049949b77d133e0676fee1d732c99e94c831052d073d0: Status 404 returned error can't find the container with id f86fe344750c43896d9049949b77d133e0676fee1d732c99e94c831052d073d0
Apr 22 16:05:51.917254 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:51.917215 2572 generic.go:358] "Generic (PLEG): container finished" podID="90b530b0-371c-47aa-91f1-d1c3946537cb" containerID="73afbde35ce7af0b48c93c9cd0965e0dc7534870e780c687312a420a3b3b42e2" exitCode=0
Apr 22 16:05:51.917432 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:51.917272 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r" event={"ID":"90b530b0-371c-47aa-91f1-d1c3946537cb","Type":"ContainerDied","Data":"73afbde35ce7af0b48c93c9cd0965e0dc7534870e780c687312a420a3b3b42e2"}
Apr 22 16:05:51.917432 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:51.917301 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r" event={"ID":"90b530b0-371c-47aa-91f1-d1c3946537cb","Type":"ContainerStarted","Data":"f86fe344750c43896d9049949b77d133e0676fee1d732c99e94c831052d073d0"}
Apr 22 16:05:52.921756 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:52.921665 2572 generic.go:358] "Generic (PLEG): container finished" podID="90b530b0-371c-47aa-91f1-d1c3946537cb" containerID="7743b4dec3dab7d6074d310abedb33b5699ed3f2efb09b505dc40c189766a860" exitCode=0
Apr 22 16:05:52.921756 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:52.921705 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r" event={"ID":"90b530b0-371c-47aa-91f1-d1c3946537cb","Type":"ContainerDied","Data":"7743b4dec3dab7d6074d310abedb33b5699ed3f2efb09b505dc40c189766a860"}
Apr 22 16:05:53.005683 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:53.005635 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-54dfb4598d-7pws5"]
Apr 22 16:05:53.009332 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:53.009301 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-7pws5"
Apr 22 16:05:53.011672 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:53.011646 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 22 16:05:53.011858 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:53.011675 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 22 16:05:53.011858 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:53.011737 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-pj4nw\""
Apr 22 16:05:53.011983 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:53.011930 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 22 16:05:53.012480 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:53.012459 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 22 16:05:53.021618 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:53.021592 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-54dfb4598d-7pws5"]
Apr 22 16:05:53.057408 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:53.057373 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9d0807db-368f-4d23-a54f-aba01a637eef-apiservice-cert\") pod \"opendatahub-operator-controller-manager-54dfb4598d-7pws5\" (UID: \"9d0807db-368f-4d23-a54f-aba01a637eef\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-7pws5"
Apr 22 16:05:53.057622 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:53.057428 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfnfh\" (UniqueName: \"kubernetes.io/projected/9d0807db-368f-4d23-a54f-aba01a637eef-kube-api-access-qfnfh\") pod \"opendatahub-operator-controller-manager-54dfb4598d-7pws5\" (UID: \"9d0807db-368f-4d23-a54f-aba01a637eef\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-7pws5" Apr 22 16:05:53.057622 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:53.057530 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9d0807db-368f-4d23-a54f-aba01a637eef-webhook-cert\") pod \"opendatahub-operator-controller-manager-54dfb4598d-7pws5\" (UID: \"9d0807db-368f-4d23-a54f-aba01a637eef\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-7pws5" Apr 22 16:05:53.158111 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:53.158070 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfnfh\" (UniqueName: \"kubernetes.io/projected/9d0807db-368f-4d23-a54f-aba01a637eef-kube-api-access-qfnfh\") pod \"opendatahub-operator-controller-manager-54dfb4598d-7pws5\" (UID: \"9d0807db-368f-4d23-a54f-aba01a637eef\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-7pws5" Apr 22 16:05:53.158316 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:53.158138 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9d0807db-368f-4d23-a54f-aba01a637eef-webhook-cert\") pod \"opendatahub-operator-controller-manager-54dfb4598d-7pws5\" (UID: \"9d0807db-368f-4d23-a54f-aba01a637eef\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-7pws5" Apr 22 16:05:53.158316 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:53.158183 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9d0807db-368f-4d23-a54f-aba01a637eef-apiservice-cert\") pod \"opendatahub-operator-controller-manager-54dfb4598d-7pws5\" (UID: \"9d0807db-368f-4d23-a54f-aba01a637eef\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-7pws5" Apr 22 16:05:53.160930 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:53.160893 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9d0807db-368f-4d23-a54f-aba01a637eef-apiservice-cert\") pod \"opendatahub-operator-controller-manager-54dfb4598d-7pws5\" (UID: \"9d0807db-368f-4d23-a54f-aba01a637eef\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-7pws5" Apr 22 16:05:53.160930 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:53.160928 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9d0807db-368f-4d23-a54f-aba01a637eef-webhook-cert\") pod \"opendatahub-operator-controller-manager-54dfb4598d-7pws5\" (UID: \"9d0807db-368f-4d23-a54f-aba01a637eef\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-7pws5" Apr 22 16:05:53.169663 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:53.169643 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfnfh\" (UniqueName: \"kubernetes.io/projected/9d0807db-368f-4d23-a54f-aba01a637eef-kube-api-access-qfnfh\") pod \"opendatahub-operator-controller-manager-54dfb4598d-7pws5\" (UID: \"9d0807db-368f-4d23-a54f-aba01a637eef\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-7pws5" Apr 22 16:05:53.320695 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:53.320604 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-7pws5" Apr 22 16:05:53.448462 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:53.448430 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-54dfb4598d-7pws5"] Apr 22 16:05:53.451880 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:05:53.451849 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d0807db_368f_4d23_a54f_aba01a637eef.slice/crio-e854858247c9614a3ee61be9591b5b87e10ffb991f3a875a05971277dff23b66 WatchSource:0}: Error finding container e854858247c9614a3ee61be9591b5b87e10ffb991f3a875a05971277dff23b66: Status 404 returned error can't find the container with id e854858247c9614a3ee61be9591b5b87e10ffb991f3a875a05971277dff23b66 Apr 22 16:05:53.926929 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:53.926896 2572 generic.go:358] "Generic (PLEG): container finished" podID="90b530b0-371c-47aa-91f1-d1c3946537cb" containerID="70c5ab86a8427051529792ba8696214f9f31735aa814cc0278468135f45ea85b" exitCode=0 Apr 22 16:05:53.927395 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:53.926964 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r" event={"ID":"90b530b0-371c-47aa-91f1-d1c3946537cb","Type":"ContainerDied","Data":"70c5ab86a8427051529792ba8696214f9f31735aa814cc0278468135f45ea85b"} Apr 22 16:05:53.928042 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:53.928022 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-7pws5" event={"ID":"9d0807db-368f-4d23-a54f-aba01a637eef","Type":"ContainerStarted","Data":"e854858247c9614a3ee61be9591b5b87e10ffb991f3a875a05971277dff23b66"} Apr 22 16:05:56.169849 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:56.169823 2572 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r" Apr 22 16:05:56.283025 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:56.282901 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92nrh\" (UniqueName: \"kubernetes.io/projected/90b530b0-371c-47aa-91f1-d1c3946537cb-kube-api-access-92nrh\") pod \"90b530b0-371c-47aa-91f1-d1c3946537cb\" (UID: \"90b530b0-371c-47aa-91f1-d1c3946537cb\") " Apr 22 16:05:56.283025 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:56.282989 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90b530b0-371c-47aa-91f1-d1c3946537cb-bundle\") pod \"90b530b0-371c-47aa-91f1-d1c3946537cb\" (UID: \"90b530b0-371c-47aa-91f1-d1c3946537cb\") " Apr 22 16:05:56.283025 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:56.283018 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90b530b0-371c-47aa-91f1-d1c3946537cb-util\") pod \"90b530b0-371c-47aa-91f1-d1c3946537cb\" (UID: \"90b530b0-371c-47aa-91f1-d1c3946537cb\") " Apr 22 16:05:56.283949 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:56.283915 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90b530b0-371c-47aa-91f1-d1c3946537cb-bundle" (OuterVolumeSpecName: "bundle") pod "90b530b0-371c-47aa-91f1-d1c3946537cb" (UID: "90b530b0-371c-47aa-91f1-d1c3946537cb"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:05:56.285481 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:56.285447 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90b530b0-371c-47aa-91f1-d1c3946537cb-kube-api-access-92nrh" (OuterVolumeSpecName: "kube-api-access-92nrh") pod "90b530b0-371c-47aa-91f1-d1c3946537cb" (UID: "90b530b0-371c-47aa-91f1-d1c3946537cb"). InnerVolumeSpecName "kube-api-access-92nrh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:05:56.289846 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:56.289640 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90b530b0-371c-47aa-91f1-d1c3946537cb-util" (OuterVolumeSpecName: "util") pod "90b530b0-371c-47aa-91f1-d1c3946537cb" (UID: "90b530b0-371c-47aa-91f1-d1c3946537cb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:05:56.384477 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:56.384435 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-92nrh\" (UniqueName: \"kubernetes.io/projected/90b530b0-371c-47aa-91f1-d1c3946537cb-kube-api-access-92nrh\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:05:56.384477 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:56.384477 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90b530b0-371c-47aa-91f1-d1c3946537cb-bundle\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:05:56.384733 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:56.384493 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90b530b0-371c-47aa-91f1-d1c3946537cb-util\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:05:56.941982 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:56.941948 2572 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r" event={"ID":"90b530b0-371c-47aa-91f1-d1c3946537cb","Type":"ContainerDied","Data":"f86fe344750c43896d9049949b77d133e0676fee1d732c99e94c831052d073d0"} Apr 22 16:05:56.941982 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:56.941979 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f86fe344750c43896d9049949b77d133e0676fee1d732c99e94c831052d073d0" Apr 22 16:05:56.941982 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:56.941993 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9nj56r" Apr 22 16:05:56.943489 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:56.943459 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-7pws5" event={"ID":"9d0807db-368f-4d23-a54f-aba01a637eef","Type":"ContainerStarted","Data":"e4584938ad8674e925411053ae6c90dcca9d2ea88459425483295ce20b0a8ee0"} Apr 22 16:05:56.943612 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:56.943594 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-7pws5" Apr 22 16:05:57.216947 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:57.216835 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-7pws5" podStartSLOduration=2.488390349 podStartE2EDuration="5.216815274s" podCreationTimestamp="2026-04-22 16:05:52 +0000 UTC" firstStartedPulling="2026-04-22 16:05:53.454273845 +0000 UTC m=+434.353827897" lastFinishedPulling="2026-04-22 16:05:56.18269877 +0000 UTC m=+437.082252822" observedRunningTime="2026-04-22 16:05:56.961505692 +0000 UTC m=+437.861059762" watchObservedRunningTime="2026-04-22 16:05:57.216815274 +0000 UTC m=+438.116369344" 
Apr 22 16:05:58.909622 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:05:58.909592 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-94h4g" Apr 22 16:06:07.949261 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:07.949231 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-7pws5" Apr 22 16:06:22.210079 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:22.210037 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68f4f8fc6-jkg98"] Apr 22 16:06:46.478837 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.478791 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk"] Apr 22 16:06:46.479345 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.479100 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90b530b0-371c-47aa-91f1-d1c3946537cb" containerName="util" Apr 22 16:06:46.479345 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.479114 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b530b0-371c-47aa-91f1-d1c3946537cb" containerName="util" Apr 22 16:06:46.479345 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.479123 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90b530b0-371c-47aa-91f1-d1c3946537cb" containerName="pull" Apr 22 16:06:46.479345 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.479128 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b530b0-371c-47aa-91f1-d1c3946537cb" containerName="pull" Apr 22 16:06:46.479345 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.479133 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90b530b0-371c-47aa-91f1-d1c3946537cb" containerName="extract" Apr 22 16:06:46.479345 ip-10-0-135-9 kubenswrapper[2572]: I0422 
16:06:46.479139 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b530b0-371c-47aa-91f1-d1c3946537cb" containerName="extract" Apr 22 16:06:46.479345 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.479232 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="90b530b0-371c-47aa-91f1-d1c3946537cb" containerName="extract" Apr 22 16:06:46.488275 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.488253 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" Apr 22 16:06:46.490926 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.490890 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 16:06:46.491069 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.490899 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 22 16:06:46.491407 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.491350 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 16:06:46.491407 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.491350 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-lmgp6\"" Apr 22 16:06:46.492516 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.492491 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk"] Apr 22 16:06:46.667712 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.667674 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-workload-certs\") pod 
\"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" Apr 22 16:06:46.667905 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.667720 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/70d62a85-d84b-4895-a70d-6e634844f5bd-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" Apr 22 16:06:46.667905 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.667745 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" Apr 22 16:06:46.667905 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.667795 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkwkr\" (UniqueName: \"kubernetes.io/projected/70d62a85-d84b-4895-a70d-6e634844f5bd-kube-api-access-zkwkr\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" Apr 22 16:06:46.668029 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.667911 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" Apr 22 16:06:46.668029 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.667940 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" Apr 22 16:06:46.668029 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.667971 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/70d62a85-d84b-4895-a70d-6e634844f5bd-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" Apr 22 16:06:46.668029 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.668015 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" Apr 22 16:06:46.668171 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.668031 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: 
\"kubernetes.io/projected/70d62a85-d84b-4895-a70d-6e634844f5bd-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" Apr 22 16:06:46.752790 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.752711 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"] Apr 22 16:06:46.754856 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.754840 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4" Apr 22 16:06:46.766117 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.766093 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"] Apr 22 16:06:46.768419 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.768393 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" Apr 22 16:06:46.768535 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.768430 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/70d62a85-d84b-4895-a70d-6e634844f5bd-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" Apr 22 16:06:46.768535 ip-10-0-135-9 
kubenswrapper[2572]: I0422 16:06:46.768464 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" Apr 22 16:06:46.768535 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.768489 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zkwkr\" (UniqueName: \"kubernetes.io/projected/70d62a85-d84b-4895-a70d-6e634844f5bd-kube-api-access-zkwkr\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" Apr 22 16:06:46.768535 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.768530 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" Apr 22 16:06:46.768756 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.768556 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" Apr 22 16:06:46.768756 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.768591 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/70d62a85-d84b-4895-a70d-6e634844f5bd-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" Apr 22 16:06:46.768756 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.768649 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" Apr 22 16:06:46.768756 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.768674 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/70d62a85-d84b-4895-a70d-6e634844f5bd-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" Apr 22 16:06:46.768973 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.768796 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" Apr 22 16:06:46.769132 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.769107 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk"
Apr 22 16:06:46.769527 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.769231 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk"
Apr 22 16:06:46.769527 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.769364 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk"
Apr 22 16:06:46.769527 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.769491 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/70d62a85-d84b-4895-a70d-6e634844f5bd-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk"
Apr 22 16:06:46.771073 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.771050 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/70d62a85-d84b-4895-a70d-6e634844f5bd-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk"
Apr 22 16:06:46.771354 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.771334 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk"
Apr 22 16:06:46.779870 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.779846 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/70d62a85-d84b-4895-a70d-6e634844f5bd-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk"
Apr 22 16:06:46.779957 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.779916 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkwkr\" (UniqueName: \"kubernetes.io/projected/70d62a85-d84b-4895-a70d-6e634844f5bd-kube-api-access-zkwkr\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd578btk\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk"
Apr 22 16:06:46.801725 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.801699 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk"
Apr 22 16:06:46.870040 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.869989 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.870181 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.870102 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.870181 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.870143 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.870340 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.870233 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.870340 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.870300 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7lph\" (UniqueName: \"kubernetes.io/projected/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-kube-api-access-c7lph\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.870471 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.870345 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.870471 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.870398 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.870471 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.870431 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.870471 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.870463 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.929890 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.929862 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk"]
Apr 22 16:06:46.932645 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:06:46.932619 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70d62a85_d84b_4895_a70d_6e634844f5bd.slice/crio-4ffeb023900007944f708e614508a1dd2a8487f3481aea5ce55a2d4fcecffad6 WatchSource:0}: Error finding container 4ffeb023900007944f708e614508a1dd2a8487f3481aea5ce55a2d4fcecffad6: Status 404 returned error can't find the container with id 4ffeb023900007944f708e614508a1dd2a8487f3481aea5ce55a2d4fcecffad6
Apr 22 16:06:46.971415 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.971385 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.971415 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.971424 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.971658 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.971447 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.971658 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.971475 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.971658 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.971512 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.971658 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.971535 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.971658 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.971580 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.971658 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.971619 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7lph\" (UniqueName: \"kubernetes.io/projected/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-kube-api-access-c7lph\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.971658 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.971647 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.971987 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.971836 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.971987 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.971866 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.972152 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.972130 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.972239 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.972218 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.972340 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.972323 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.973933 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.973918 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.974145 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.974127 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.978968 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.978938 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:46.979066 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:46.979043 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7lph\" (UniqueName: \"kubernetes.io/projected/e23cc38c-1adb-4c8a-ae0a-759d22d95fcf-kube-api-access-c7lph\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4\" (UID: \"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:47.064451 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.064369 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:47.100217 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.100160 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" event={"ID":"70d62a85-d84b-4895-a70d-6e634844f5bd","Type":"ContainerStarted","Data":"4ffeb023900007944f708e614508a1dd2a8487f3481aea5ce55a2d4fcecffad6"}
Apr 22 16:06:47.191558 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.191410 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"]
Apr 22 16:06:47.194629 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:06:47.194598 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode23cc38c_1adb_4c8a_ae0a_759d22d95fcf.slice/crio-bc73857d9f465a2f2b29053d66b49d8d2e1e7814d16a12c60197808114da894e WatchSource:0}: Error finding container bc73857d9f465a2f2b29053d66b49d8d2e1e7814d16a12c60197808114da894e: Status 404 returned error can't find the container with id bc73857d9f465a2f2b29053d66b49d8d2e1e7814d16a12c60197808114da894e
Apr 22 16:06:47.229957 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.229895 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-68f4f8fc6-jkg98" podUID="24164a60-dfe4-4424-a7dd-ce0172e05f3a" containerName="console" containerID="cri-o://aea7be47de80212d6f511dfbbae3e5119af2977e917d51e6585b5bbf3d4deab3" gracePeriod=15
Apr 22 16:06:47.460536 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.460515 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68f4f8fc6-jkg98_24164a60-dfe4-4424-a7dd-ce0172e05f3a/console/0.log"
Apr 22 16:06:47.460675 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.460580 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68f4f8fc6-jkg98"
Apr 22 16:06:47.577457 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.577398 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24164a60-dfe4-4424-a7dd-ce0172e05f3a-trusted-ca-bundle\") pod \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") "
Apr 22 16:06:47.577457 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.577465 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzn29\" (UniqueName: \"kubernetes.io/projected/24164a60-dfe4-4424-a7dd-ce0172e05f3a-kube-api-access-xzn29\") pod \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") "
Apr 22 16:06:47.577983 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.577483 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/24164a60-dfe4-4424-a7dd-ce0172e05f3a-console-serving-cert\") pod \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") "
Apr 22 16:06:47.577983 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.577530 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/24164a60-dfe4-4424-a7dd-ce0172e05f3a-console-oauth-config\") pod \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") "
Apr 22 16:06:47.577983 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.577547 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/24164a60-dfe4-4424-a7dd-ce0172e05f3a-console-config\") pod \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") "
Apr 22 16:06:47.577983 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.577567 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24164a60-dfe4-4424-a7dd-ce0172e05f3a-service-ca\") pod \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") "
Apr 22 16:06:47.577983 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.577668 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/24164a60-dfe4-4424-a7dd-ce0172e05f3a-oauth-serving-cert\") pod \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\" (UID: \"24164a60-dfe4-4424-a7dd-ce0172e05f3a\") "
Apr 22 16:06:47.578238 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.577988 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24164a60-dfe4-4424-a7dd-ce0172e05f3a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "24164a60-dfe4-4424-a7dd-ce0172e05f3a" (UID: "24164a60-dfe4-4424-a7dd-ce0172e05f3a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:06:47.578238 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.578122 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24164a60-dfe4-4424-a7dd-ce0172e05f3a-console-config" (OuterVolumeSpecName: "console-config") pod "24164a60-dfe4-4424-a7dd-ce0172e05f3a" (UID: "24164a60-dfe4-4424-a7dd-ce0172e05f3a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:06:47.578238 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.578127 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24164a60-dfe4-4424-a7dd-ce0172e05f3a-service-ca" (OuterVolumeSpecName: "service-ca") pod "24164a60-dfe4-4424-a7dd-ce0172e05f3a" (UID: "24164a60-dfe4-4424-a7dd-ce0172e05f3a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:06:47.578416 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.578295 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24164a60-dfe4-4424-a7dd-ce0172e05f3a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "24164a60-dfe4-4424-a7dd-ce0172e05f3a" (UID: "24164a60-dfe4-4424-a7dd-ce0172e05f3a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:06:47.579950 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.579930 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24164a60-dfe4-4424-a7dd-ce0172e05f3a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "24164a60-dfe4-4424-a7dd-ce0172e05f3a" (UID: "24164a60-dfe4-4424-a7dd-ce0172e05f3a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 16:06:47.580012 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.579973 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24164a60-dfe4-4424-a7dd-ce0172e05f3a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "24164a60-dfe4-4424-a7dd-ce0172e05f3a" (UID: "24164a60-dfe4-4424-a7dd-ce0172e05f3a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 16:06:47.580047 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.580029 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24164a60-dfe4-4424-a7dd-ce0172e05f3a-kube-api-access-xzn29" (OuterVolumeSpecName: "kube-api-access-xzn29") pod "24164a60-dfe4-4424-a7dd-ce0172e05f3a" (UID: "24164a60-dfe4-4424-a7dd-ce0172e05f3a"). InnerVolumeSpecName "kube-api-access-xzn29". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:06:47.678463 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.678434 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/24164a60-dfe4-4424-a7dd-ce0172e05f3a-console-oauth-config\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\""
Apr 22 16:06:47.678463 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.678459 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/24164a60-dfe4-4424-a7dd-ce0172e05f3a-console-config\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\""
Apr 22 16:06:47.678463 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.678469 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24164a60-dfe4-4424-a7dd-ce0172e05f3a-service-ca\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\""
Apr 22 16:06:47.678687 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.678477 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/24164a60-dfe4-4424-a7dd-ce0172e05f3a-oauth-serving-cert\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\""
Apr 22 16:06:47.678687 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.678486 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24164a60-dfe4-4424-a7dd-ce0172e05f3a-trusted-ca-bundle\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\""
Apr 22 16:06:47.678687 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.678495 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xzn29\" (UniqueName: \"kubernetes.io/projected/24164a60-dfe4-4424-a7dd-ce0172e05f3a-kube-api-access-xzn29\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\""
Apr 22 16:06:47.678687 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:47.678504 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/24164a60-dfe4-4424-a7dd-ce0172e05f3a-console-serving-cert\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\""
Apr 22 16:06:48.107658 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:48.106260 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68f4f8fc6-jkg98_24164a60-dfe4-4424-a7dd-ce0172e05f3a/console/0.log"
Apr 22 16:06:48.107658 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:48.106315 2572 generic.go:358] "Generic (PLEG): container finished" podID="24164a60-dfe4-4424-a7dd-ce0172e05f3a" containerID="aea7be47de80212d6f511dfbbae3e5119af2977e917d51e6585b5bbf3d4deab3" exitCode=2
Apr 22 16:06:48.107658 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:48.106421 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68f4f8fc6-jkg98" event={"ID":"24164a60-dfe4-4424-a7dd-ce0172e05f3a","Type":"ContainerDied","Data":"aea7be47de80212d6f511dfbbae3e5119af2977e917d51e6585b5bbf3d4deab3"}
Apr 22 16:06:48.107658 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:48.106450 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68f4f8fc6-jkg98" event={"ID":"24164a60-dfe4-4424-a7dd-ce0172e05f3a","Type":"ContainerDied","Data":"7dac0f9a8a830650099ba3e84a06aa200f3c3b00d4fd6f54e9920f3d1cb2c40a"}
Apr 22 16:06:48.107658 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:48.106470 2572 scope.go:117] "RemoveContainer" containerID="aea7be47de80212d6f511dfbbae3e5119af2977e917d51e6585b5bbf3d4deab3"
Apr 22 16:06:48.107658 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:48.106625 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68f4f8fc6-jkg98"
Apr 22 16:06:48.111402 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:48.110741 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4" event={"ID":"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf","Type":"ContainerStarted","Data":"bc73857d9f465a2f2b29053d66b49d8d2e1e7814d16a12c60197808114da894e"}
Apr 22 16:06:48.126087 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:48.126051 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68f4f8fc6-jkg98"]
Apr 22 16:06:48.130249 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:48.130215 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-68f4f8fc6-jkg98"]
Apr 22 16:06:48.169452 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:48.169428 2572 scope.go:117] "RemoveContainer" containerID="aea7be47de80212d6f511dfbbae3e5119af2977e917d51e6585b5bbf3d4deab3"
Apr 22 16:06:48.169886 ip-10-0-135-9 kubenswrapper[2572]: E0422 16:06:48.169854 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aea7be47de80212d6f511dfbbae3e5119af2977e917d51e6585b5bbf3d4deab3\": container with ID starting with aea7be47de80212d6f511dfbbae3e5119af2977e917d51e6585b5bbf3d4deab3 not found: ID does not exist" containerID="aea7be47de80212d6f511dfbbae3e5119af2977e917d51e6585b5bbf3d4deab3"
Apr 22 16:06:48.169991 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:48.169892 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aea7be47de80212d6f511dfbbae3e5119af2977e917d51e6585b5bbf3d4deab3"} err="failed to get container status \"aea7be47de80212d6f511dfbbae3e5119af2977e917d51e6585b5bbf3d4deab3\": rpc error: code = NotFound desc = could not find container \"aea7be47de80212d6f511dfbbae3e5119af2977e917d51e6585b5bbf3d4deab3\": container with ID starting with aea7be47de80212d6f511dfbbae3e5119af2977e917d51e6585b5bbf3d4deab3 not found: ID does not exist"
Apr 22 16:06:49.567515 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:49.567470 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"}
Apr 22 16:06:49.567845 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:49.567561 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"}
Apr 22 16:06:49.567845 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:49.567602 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"}
Apr 22 16:06:49.573011 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:49.572984 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"}
Apr 22 16:06:49.573121 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:49.573074 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"}
Apr 22 16:06:49.573121 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:49.573111 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"}
Apr 22 16:06:49.645462 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:49.645428 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24164a60-dfe4-4424-a7dd-ce0172e05f3a" path="/var/lib/kubelet/pods/24164a60-dfe4-4424-a7dd-ce0172e05f3a/volumes"
Apr 22 16:06:50.125053 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:50.125014 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" event={"ID":"70d62a85-d84b-4895-a70d-6e634844f5bd","Type":"ContainerStarted","Data":"c52d419a6668e31f1aa83fe25404fbd0f8bb2789321347dbc6a9a64be56a424a"}
Apr 22 16:06:50.126402 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:50.126321 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4" event={"ID":"e23cc38c-1adb-4c8a-ae0a-759d22d95fcf","Type":"ContainerStarted","Data":"75c86896851e72b09ecc810b0d35b1638115679dbb37b734d4b9090b6b2f316f"}
Apr 22 16:06:50.144726 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:50.144676 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" podStartSLOduration=1.512574253 podStartE2EDuration="4.14466296s" podCreationTimestamp="2026-04-22 16:06:46 +0000 UTC" firstStartedPulling="2026-04-22 16:06:46.935063114 +0000 UTC m=+487.834617164" lastFinishedPulling="2026-04-22 16:06:49.567151805 +0000 UTC m=+490.466705871" observedRunningTime="2026-04-22 16:06:50.142726809 +0000 UTC m=+491.042280879" watchObservedRunningTime="2026-04-22 16:06:50.14466296 +0000 UTC m=+491.044217030"
Apr 22 16:06:50.159900 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:50.159850 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4" podStartSLOduration=1.783651527 podStartE2EDuration="4.159830154s" podCreationTimestamp="2026-04-22 16:06:46 +0000 UTC" firstStartedPulling="2026-04-22 16:06:47.196584204 +0000 UTC m=+488.096138257" lastFinishedPulling="2026-04-22 16:06:49.572762823 +0000 UTC m=+490.472316884" observedRunningTime="2026-04-22 16:06:50.158863938 +0000 UTC m=+491.058418009" watchObservedRunningTime="2026-04-22 16:06:50.159830154 +0000 UTC m=+491.059384224"
Apr 22 16:06:50.802873 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:50.802835 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk"
Apr 22 16:06:50.804058 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:50.804031 2572 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.23:15021/healthz/ready\": dial tcp 10.133.0.23:15021: connect: connection refused" start-of-body=
Apr 22 16:06:50.804174 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:50.804082 2572 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" podUID="70d62a85-d84b-4895-a70d-6e634844f5bd" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.23:15021/healthz/ready\": dial tcp 10.133.0.23:15021: connect: connection refused"
Apr 22 16:06:51.065623 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:51.065528 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:51.070353 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:51.070323 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:51.130574 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:51.130540 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:51.131409 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:51.131392 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4"
Apr 22 16:06:51.175526 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:51.175490 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk"]
Apr 22 16:06:51.803088 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:51.803055 2572 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.23:15021/healthz/ready\": dial tcp 10.133.0.23:15021: connect: connection refused" start-of-body=
Apr 22 16:06:51.803493 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:51.803112 2572 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" podUID="70d62a85-d84b-4895-a70d-6e634844f5bd" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.23:15021/healthz/ready\": dial tcp 10.133.0.23:15021: connect: connection refused"
Apr 22 16:06:52.803138 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:52.803106 2572 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.23:15021/healthz/ready\": dial tcp 10.133.0.23:15021: connect: connection refused" start-of-body=
Apr 22 16:06:52.803612 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:52.803158 2572 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" podUID="70d62a85-d84b-4895-a70d-6e634844f5bd" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.23:15021/healthz/ready\": dial tcp 10.133.0.23:15021: connect: connection refused"
Apr 22 16:06:53.136766 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:53.136714 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" podUID="70d62a85-d84b-4895-a70d-6e634844f5bd" containerName="istio-proxy" containerID="cri-o://c52d419a6668e31f1aa83fe25404fbd0f8bb2789321347dbc6a9a64be56a424a" gracePeriod=30
Apr 22 16:06:58.374120 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.374095 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" Apr 22 16:06:58.467247 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.467121 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/70d62a85-d84b-4895-a70d-6e634844f5bd-istiod-ca-cert\") pod \"70d62a85-d84b-4895-a70d-6e634844f5bd\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " Apr 22 16:06:58.467247 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.467174 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-workload-socket\") pod \"70d62a85-d84b-4895-a70d-6e634844f5bd\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " Apr 22 16:06:58.467247 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.467234 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkwkr\" (UniqueName: \"kubernetes.io/projected/70d62a85-d84b-4895-a70d-6e634844f5bd-kube-api-access-zkwkr\") pod \"70d62a85-d84b-4895-a70d-6e634844f5bd\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " Apr 22 16:06:58.467509 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.467258 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/70d62a85-d84b-4895-a70d-6e634844f5bd-istio-podinfo\") pod \"70d62a85-d84b-4895-a70d-6e634844f5bd\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " Apr 22 16:06:58.467509 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.467279 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-workload-certs\") pod \"70d62a85-d84b-4895-a70d-6e634844f5bd\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " Apr 22 
16:06:58.467509 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.467308 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-credential-socket\") pod \"70d62a85-d84b-4895-a70d-6e634844f5bd\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " Apr 22 16:06:58.467509 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.467420 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-istio-data\") pod \"70d62a85-d84b-4895-a70d-6e634844f5bd\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " Apr 22 16:06:58.467509 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.467476 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-istio-envoy\") pod \"70d62a85-d84b-4895-a70d-6e634844f5bd\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " Apr 22 16:06:58.467754 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.467526 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/70d62a85-d84b-4895-a70d-6e634844f5bd-istio-token\") pod \"70d62a85-d84b-4895-a70d-6e634844f5bd\" (UID: \"70d62a85-d84b-4895-a70d-6e634844f5bd\") " Apr 22 16:06:58.467754 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.467525 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70d62a85-d84b-4895-a70d-6e634844f5bd-istiod-ca-cert" (OuterVolumeSpecName: "istiod-ca-cert") pod "70d62a85-d84b-4895-a70d-6e634844f5bd" (UID: "70d62a85-d84b-4895-a70d-6e634844f5bd"). InnerVolumeSpecName "istiod-ca-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:06:58.467754 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.467616 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-workload-certs" (OuterVolumeSpecName: "workload-certs") pod "70d62a85-d84b-4895-a70d-6e634844f5bd" (UID: "70d62a85-d84b-4895-a70d-6e634844f5bd"). InnerVolumeSpecName "workload-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:06:58.467754 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.467623 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-workload-socket" (OuterVolumeSpecName: "workload-socket") pod "70d62a85-d84b-4895-a70d-6e634844f5bd" (UID: "70d62a85-d84b-4895-a70d-6e634844f5bd"). InnerVolumeSpecName "workload-socket". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:06:58.467754 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.467713 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-istio-data" (OuterVolumeSpecName: "istio-data") pod "70d62a85-d84b-4895-a70d-6e634844f5bd" (UID: "70d62a85-d84b-4895-a70d-6e634844f5bd"). InnerVolumeSpecName "istio-data". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:06:58.467984 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.467845 2572 reconciler_common.go:299] "Volume detached for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-workload-certs\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:06:58.467984 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.467865 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-credential-socket" (OuterVolumeSpecName: "credential-socket") pod "70d62a85-d84b-4895-a70d-6e634844f5bd" (UID: "70d62a85-d84b-4895-a70d-6e634844f5bd"). InnerVolumeSpecName "credential-socket". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:06:58.467984 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.467874 2572 reconciler_common.go:299] "Volume detached for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-istio-data\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:06:58.467984 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.467888 2572 reconciler_common.go:299] "Volume detached for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/70d62a85-d84b-4895-a70d-6e634844f5bd-istiod-ca-cert\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:06:58.467984 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.467901 2572 reconciler_common.go:299] "Volume detached for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-workload-socket\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:06:58.469938 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.469909 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70d62a85-d84b-4895-a70d-6e634844f5bd-istio-token" 
(OuterVolumeSpecName: "istio-token") pod "70d62a85-d84b-4895-a70d-6e634844f5bd" (UID: "70d62a85-d84b-4895-a70d-6e634844f5bd"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:06:58.469938 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.469908 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70d62a85-d84b-4895-a70d-6e634844f5bd-kube-api-access-zkwkr" (OuterVolumeSpecName: "kube-api-access-zkwkr") pod "70d62a85-d84b-4895-a70d-6e634844f5bd" (UID: "70d62a85-d84b-4895-a70d-6e634844f5bd"). InnerVolumeSpecName "kube-api-access-zkwkr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:06:58.470160 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.470145 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/70d62a85-d84b-4895-a70d-6e634844f5bd-istio-podinfo" (OuterVolumeSpecName: "istio-podinfo") pod "70d62a85-d84b-4895-a70d-6e634844f5bd" (UID: "70d62a85-d84b-4895-a70d-6e634844f5bd"). InnerVolumeSpecName "istio-podinfo". PluginName "kubernetes.io/downward-api", VolumeGIDValue "" Apr 22 16:06:58.470251 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.470161 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-istio-envoy" (OuterVolumeSpecName: "istio-envoy") pod "70d62a85-d84b-4895-a70d-6e634844f5bd" (UID: "70d62a85-d84b-4895-a70d-6e634844f5bd"). InnerVolumeSpecName "istio-envoy". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:06:58.568541 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.568502 2572 reconciler_common.go:299] "Volume detached for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-istio-envoy\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:06:58.568541 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.568535 2572 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/70d62a85-d84b-4895-a70d-6e634844f5bd-istio-token\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:06:58.568541 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.568548 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zkwkr\" (UniqueName: \"kubernetes.io/projected/70d62a85-d84b-4895-a70d-6e634844f5bd-kube-api-access-zkwkr\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:06:58.568787 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.568561 2572 reconciler_common.go:299] "Volume detached for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/70d62a85-d84b-4895-a70d-6e634844f5bd-istio-podinfo\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:06:58.568787 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:58.568573 2572 reconciler_common.go:299] "Volume detached for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/70d62a85-d84b-4895-a70d-6e634844f5bd-credential-socket\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:06:59.155788 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:59.155755 2572 generic.go:358] "Generic (PLEG): container finished" podID="70d62a85-d84b-4895-a70d-6e634844f5bd" containerID="c52d419a6668e31f1aa83fe25404fbd0f8bb2789321347dbc6a9a64be56a424a" exitCode=0 Apr 22 16:06:59.155973 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:59.155818 2572 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" Apr 22 16:06:59.155973 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:59.155844 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" event={"ID":"70d62a85-d84b-4895-a70d-6e634844f5bd","Type":"ContainerDied","Data":"c52d419a6668e31f1aa83fe25404fbd0f8bb2789321347dbc6a9a64be56a424a"} Apr 22 16:06:59.155973 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:59.155880 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk" event={"ID":"70d62a85-d84b-4895-a70d-6e634844f5bd","Type":"ContainerDied","Data":"4ffeb023900007944f708e614508a1dd2a8487f3481aea5ce55a2d4fcecffad6"} Apr 22 16:06:59.155973 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:59.155895 2572 scope.go:117] "RemoveContainer" containerID="c52d419a6668e31f1aa83fe25404fbd0f8bb2789321347dbc6a9a64be56a424a" Apr 22 16:06:59.165030 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:59.165013 2572 scope.go:117] "RemoveContainer" containerID="c52d419a6668e31f1aa83fe25404fbd0f8bb2789321347dbc6a9a64be56a424a" Apr 22 16:06:59.165316 ip-10-0-135-9 kubenswrapper[2572]: E0422 16:06:59.165297 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c52d419a6668e31f1aa83fe25404fbd0f8bb2789321347dbc6a9a64be56a424a\": container with ID starting with c52d419a6668e31f1aa83fe25404fbd0f8bb2789321347dbc6a9a64be56a424a not found: ID does not exist" containerID="c52d419a6668e31f1aa83fe25404fbd0f8bb2789321347dbc6a9a64be56a424a" Apr 22 16:06:59.165366 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:59.165325 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c52d419a6668e31f1aa83fe25404fbd0f8bb2789321347dbc6a9a64be56a424a"} 
err="failed to get container status \"c52d419a6668e31f1aa83fe25404fbd0f8bb2789321347dbc6a9a64be56a424a\": rpc error: code = NotFound desc = could not find container \"c52d419a6668e31f1aa83fe25404fbd0f8bb2789321347dbc6a9a64be56a424a\": container with ID starting with c52d419a6668e31f1aa83fe25404fbd0f8bb2789321347dbc6a9a64be56a424a not found: ID does not exist" Apr 22 16:06:59.381141 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:59.381101 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk"] Apr 22 16:06:59.386830 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:59.386798 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd578btk"] Apr 22 16:06:59.648407 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:06:59.648374 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70d62a85-d84b-4895-a70d-6e634844f5bd" path="/var/lib/kubelet/pods/70d62a85-d84b-4895-a70d-6e634844f5bd/volumes" Apr 22 16:07:10.901128 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:10.901088 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-kf7rb"] Apr 22 16:07:10.901527 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:10.901390 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24164a60-dfe4-4424-a7dd-ce0172e05f3a" containerName="console" Apr 22 16:07:10.901527 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:10.901402 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="24164a60-dfe4-4424-a7dd-ce0172e05f3a" containerName="console" Apr 22 16:07:10.901527 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:10.901423 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70d62a85-d84b-4895-a70d-6e634844f5bd" containerName="istio-proxy" Apr 22 16:07:10.901527 ip-10-0-135-9 kubenswrapper[2572]: I0422 
16:07:10.901428 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="70d62a85-d84b-4895-a70d-6e634844f5bd" containerName="istio-proxy" Apr 22 16:07:10.901527 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:10.901473 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="70d62a85-d84b-4895-a70d-6e634844f5bd" containerName="istio-proxy" Apr 22 16:07:10.901527 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:10.901483 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="24164a60-dfe4-4424-a7dd-ce0172e05f3a" containerName="console" Apr 22 16:07:10.903970 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:10.903948 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-kf7rb" Apr 22 16:07:10.906618 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:10.906587 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 16:07:10.906618 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:10.906594 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 16:07:10.906805 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:10.906674 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-p42cq\"" Apr 22 16:07:10.914850 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:10.914766 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-kf7rb"] Apr 22 16:07:10.969648 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:10.969610 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxfdk\" (UniqueName: \"kubernetes.io/projected/e1115486-9ff4-4d75-bf4d-4a7216c6d270-kube-api-access-hxfdk\") pod \"kuadrant-operator-catalog-kf7rb\" (UID: 
\"e1115486-9ff4-4d75-bf4d-4a7216c6d270\") " pod="kuadrant-system/kuadrant-operator-catalog-kf7rb" Apr 22 16:07:11.070507 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:11.070475 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxfdk\" (UniqueName: \"kubernetes.io/projected/e1115486-9ff4-4d75-bf4d-4a7216c6d270-kube-api-access-hxfdk\") pod \"kuadrant-operator-catalog-kf7rb\" (UID: \"e1115486-9ff4-4d75-bf4d-4a7216c6d270\") " pod="kuadrant-system/kuadrant-operator-catalog-kf7rb" Apr 22 16:07:11.078606 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:11.078582 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxfdk\" (UniqueName: \"kubernetes.io/projected/e1115486-9ff4-4d75-bf4d-4a7216c6d270-kube-api-access-hxfdk\") pod \"kuadrant-operator-catalog-kf7rb\" (UID: \"e1115486-9ff4-4d75-bf4d-4a7216c6d270\") " pod="kuadrant-system/kuadrant-operator-catalog-kf7rb" Apr 22 16:07:11.215549 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:11.215462 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-kf7rb" Apr 22 16:07:11.264305 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:11.264272 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-kf7rb"] Apr 22 16:07:11.343901 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:11.343864 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-kf7rb"] Apr 22 16:07:11.346792 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:07:11.346763 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1115486_9ff4_4d75_bf4d_4a7216c6d270.slice/crio-ba2907fa635e91dc73c4482df7f4e16dc8f709b4af3fbd50a32330d4c380cccd WatchSource:0}: Error finding container ba2907fa635e91dc73c4482df7f4e16dc8f709b4af3fbd50a32330d4c380cccd: Status 404 returned error can't find the container with id ba2907fa635e91dc73c4482df7f4e16dc8f709b4af3fbd50a32330d4c380cccd Apr 22 16:07:11.472148 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:11.472063 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-flrd4"] Apr 22 16:07:11.474818 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:11.474800 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-flrd4" Apr 22 16:07:11.481634 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:11.481609 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-flrd4"] Apr 22 16:07:11.574944 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:11.574914 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5rmc\" (UniqueName: \"kubernetes.io/projected/866a656c-4a80-40bb-9ca9-9162f1afea85-kube-api-access-t5rmc\") pod \"kuadrant-operator-catalog-flrd4\" (UID: \"866a656c-4a80-40bb-9ca9-9162f1afea85\") " pod="kuadrant-system/kuadrant-operator-catalog-flrd4" Apr 22 16:07:11.675546 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:11.675513 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5rmc\" (UniqueName: \"kubernetes.io/projected/866a656c-4a80-40bb-9ca9-9162f1afea85-kube-api-access-t5rmc\") pod \"kuadrant-operator-catalog-flrd4\" (UID: \"866a656c-4a80-40bb-9ca9-9162f1afea85\") " pod="kuadrant-system/kuadrant-operator-catalog-flrd4" Apr 22 16:07:11.683626 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:11.683596 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5rmc\" (UniqueName: \"kubernetes.io/projected/866a656c-4a80-40bb-9ca9-9162f1afea85-kube-api-access-t5rmc\") pod \"kuadrant-operator-catalog-flrd4\" (UID: \"866a656c-4a80-40bb-9ca9-9162f1afea85\") " pod="kuadrant-system/kuadrant-operator-catalog-flrd4" Apr 22 16:07:11.784749 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:11.784669 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-flrd4" Apr 22 16:07:11.928937 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:11.928909 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-flrd4"] Apr 22 16:07:11.931645 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:07:11.931620 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod866a656c_4a80_40bb_9ca9_9162f1afea85.slice/crio-e8e44a7aaa34b3793a164070efd061c2430635e5f11def17312a7d70383ea31f WatchSource:0}: Error finding container e8e44a7aaa34b3793a164070efd061c2430635e5f11def17312a7d70383ea31f: Status 404 returned error can't find the container with id e8e44a7aaa34b3793a164070efd061c2430635e5f11def17312a7d70383ea31f Apr 22 16:07:12.200719 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:12.200665 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-flrd4" event={"ID":"866a656c-4a80-40bb-9ca9-9162f1afea85","Type":"ContainerStarted","Data":"e8e44a7aaa34b3793a164070efd061c2430635e5f11def17312a7d70383ea31f"} Apr 22 16:07:12.201771 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:12.201741 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-kf7rb" event={"ID":"e1115486-9ff4-4d75-bf4d-4a7216c6d270","Type":"ContainerStarted","Data":"ba2907fa635e91dc73c4482df7f4e16dc8f709b4af3fbd50a32330d4c380cccd"} Apr 22 16:07:14.211934 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:14.211897 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-flrd4" event={"ID":"866a656c-4a80-40bb-9ca9-9162f1afea85","Type":"ContainerStarted","Data":"0097cb004d4eb16359fcfc9326d046a22e904ab55a132eecafa6482c0c4bdc68"} Apr 22 16:07:14.213149 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:14.213128 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/kuadrant-operator-catalog-kf7rb" event={"ID":"e1115486-9ff4-4d75-bf4d-4a7216c6d270","Type":"ContainerStarted","Data":"cf08e815afe41b81c6fb596f6f2a98e17f79b0cc4d2709d9fa381726c17bcdb8"} Apr 22 16:07:14.213297 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:14.213245 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-kf7rb" podUID="e1115486-9ff4-4d75-bf4d-4a7216c6d270" containerName="registry-server" containerID="cri-o://cf08e815afe41b81c6fb596f6f2a98e17f79b0cc4d2709d9fa381726c17bcdb8" gracePeriod=2 Apr 22 16:07:14.226546 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:14.226497 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-flrd4" podStartSLOduration=1.688330152 podStartE2EDuration="3.22648351s" podCreationTimestamp="2026-04-22 16:07:11 +0000 UTC" firstStartedPulling="2026-04-22 16:07:11.933349981 +0000 UTC m=+512.832904030" lastFinishedPulling="2026-04-22 16:07:13.471503336 +0000 UTC m=+514.371057388" observedRunningTime="2026-04-22 16:07:14.225862293 +0000 UTC m=+515.125416364" watchObservedRunningTime="2026-04-22 16:07:14.22648351 +0000 UTC m=+515.126037559" Apr 22 16:07:14.240601 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:14.240546 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-kf7rb" podStartSLOduration=2.120374405 podStartE2EDuration="4.240530687s" podCreationTimestamp="2026-04-22 16:07:10 +0000 UTC" firstStartedPulling="2026-04-22 16:07:11.348719666 +0000 UTC m=+512.248273715" lastFinishedPulling="2026-04-22 16:07:13.468875944 +0000 UTC m=+514.368429997" observedRunningTime="2026-04-22 16:07:14.238621387 +0000 UTC m=+515.138175458" watchObservedRunningTime="2026-04-22 16:07:14.240530687 +0000 UTC m=+515.140084758" Apr 22 16:07:14.441445 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:14.441419 2572 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-kf7rb" Apr 22 16:07:14.499815 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:14.499722 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxfdk\" (UniqueName: \"kubernetes.io/projected/e1115486-9ff4-4d75-bf4d-4a7216c6d270-kube-api-access-hxfdk\") pod \"e1115486-9ff4-4d75-bf4d-4a7216c6d270\" (UID: \"e1115486-9ff4-4d75-bf4d-4a7216c6d270\") " Apr 22 16:07:14.502009 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:14.501987 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1115486-9ff4-4d75-bf4d-4a7216c6d270-kube-api-access-hxfdk" (OuterVolumeSpecName: "kube-api-access-hxfdk") pod "e1115486-9ff4-4d75-bf4d-4a7216c6d270" (UID: "e1115486-9ff4-4d75-bf4d-4a7216c6d270"). InnerVolumeSpecName "kube-api-access-hxfdk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:07:14.600628 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:14.600593 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hxfdk\" (UniqueName: \"kubernetes.io/projected/e1115486-9ff4-4d75-bf4d-4a7216c6d270-kube-api-access-hxfdk\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:07:15.217457 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:15.217422 2572 generic.go:358] "Generic (PLEG): container finished" podID="e1115486-9ff4-4d75-bf4d-4a7216c6d270" containerID="cf08e815afe41b81c6fb596f6f2a98e17f79b0cc4d2709d9fa381726c17bcdb8" exitCode=0 Apr 22 16:07:15.217932 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:15.217481 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-kf7rb"
Apr 22 16:07:15.217932 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:15.217508 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-kf7rb" event={"ID":"e1115486-9ff4-4d75-bf4d-4a7216c6d270","Type":"ContainerDied","Data":"cf08e815afe41b81c6fb596f6f2a98e17f79b0cc4d2709d9fa381726c17bcdb8"}
Apr 22 16:07:15.217932 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:15.217550 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-kf7rb" event={"ID":"e1115486-9ff4-4d75-bf4d-4a7216c6d270","Type":"ContainerDied","Data":"ba2907fa635e91dc73c4482df7f4e16dc8f709b4af3fbd50a32330d4c380cccd"}
Apr 22 16:07:15.217932 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:15.217572 2572 scope.go:117] "RemoveContainer" containerID="cf08e815afe41b81c6fb596f6f2a98e17f79b0cc4d2709d9fa381726c17bcdb8"
Apr 22 16:07:15.227271 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:15.227253 2572 scope.go:117] "RemoveContainer" containerID="cf08e815afe41b81c6fb596f6f2a98e17f79b0cc4d2709d9fa381726c17bcdb8"
Apr 22 16:07:15.227531 ip-10-0-135-9 kubenswrapper[2572]: E0422 16:07:15.227513 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf08e815afe41b81c6fb596f6f2a98e17f79b0cc4d2709d9fa381726c17bcdb8\": container with ID starting with cf08e815afe41b81c6fb596f6f2a98e17f79b0cc4d2709d9fa381726c17bcdb8 not found: ID does not exist" containerID="cf08e815afe41b81c6fb596f6f2a98e17f79b0cc4d2709d9fa381726c17bcdb8"
Apr 22 16:07:15.227578 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:15.227541 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf08e815afe41b81c6fb596f6f2a98e17f79b0cc4d2709d9fa381726c17bcdb8"} err="failed to get container status \"cf08e815afe41b81c6fb596f6f2a98e17f79b0cc4d2709d9fa381726c17bcdb8\": rpc error: code = NotFound desc = could not find container \"cf08e815afe41b81c6fb596f6f2a98e17f79b0cc4d2709d9fa381726c17bcdb8\": container with ID starting with cf08e815afe41b81c6fb596f6f2a98e17f79b0cc4d2709d9fa381726c17bcdb8 not found: ID does not exist"
Apr 22 16:07:15.237331 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:15.237305 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-kf7rb"]
Apr 22 16:07:15.239574 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:15.239555 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-kf7rb"]
Apr 22 16:07:15.644246 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:15.644214 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1115486-9ff4-4d75-bf4d-4a7216c6d270" path="/var/lib/kubelet/pods/e1115486-9ff4-4d75-bf4d-4a7216c6d270/volumes"
Apr 22 16:07:21.785699 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:21.785652 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-flrd4"
Apr 22 16:07:21.786162 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:21.785729 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-flrd4"
Apr 22 16:07:21.807652 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:21.807625 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-flrd4"
Apr 22 16:07:22.265911 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:22.265886 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-flrd4"
Apr 22 16:07:26.100150 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:26.100115 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j"]
Apr 22 16:07:26.100627 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:26.100430 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1115486-9ff4-4d75-bf4d-4a7216c6d270" containerName="registry-server"
Apr 22 16:07:26.100627 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:26.100442 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1115486-9ff4-4d75-bf4d-4a7216c6d270" containerName="registry-server"
Apr 22 16:07:26.100627 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:26.100488 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e1115486-9ff4-4d75-bf4d-4a7216c6d270" containerName="registry-server"
Apr 22 16:07:26.109262 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:26.109236 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j"
Apr 22 16:07:26.110538 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:26.110511 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j"]
Apr 22 16:07:26.111517 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:26.111495 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-c25fv\""
Apr 22 16:07:26.189824 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:26.189785 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/957b9816-c3b2-4085-9c84-b215810dd6ca-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j\" (UID: \"957b9816-c3b2-4085-9c84-b215810dd6ca\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j"
Apr 22 16:07:26.189995 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:26.189890 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/957b9816-c3b2-4085-9c84-b215810dd6ca-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j\" (UID: \"957b9816-c3b2-4085-9c84-b215810dd6ca\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j"
Apr 22 16:07:26.189995 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:26.189925 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8njs\" (UniqueName: \"kubernetes.io/projected/957b9816-c3b2-4085-9c84-b215810dd6ca-kube-api-access-j8njs\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j\" (UID: \"957b9816-c3b2-4085-9c84-b215810dd6ca\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j"
Apr 22 16:07:26.291263 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:26.291138 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/957b9816-c3b2-4085-9c84-b215810dd6ca-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j\" (UID: \"957b9816-c3b2-4085-9c84-b215810dd6ca\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j"
Apr 22 16:07:26.291263 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:26.291236 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8njs\" (UniqueName: \"kubernetes.io/projected/957b9816-c3b2-4085-9c84-b215810dd6ca-kube-api-access-j8njs\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j\" (UID: \"957b9816-c3b2-4085-9c84-b215810dd6ca\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j"
Apr 22 16:07:26.291507 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:26.291281 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/957b9816-c3b2-4085-9c84-b215810dd6ca-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j\" (UID: \"957b9816-c3b2-4085-9c84-b215810dd6ca\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j"
Apr 22 16:07:26.291567 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:26.291550 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/957b9816-c3b2-4085-9c84-b215810dd6ca-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j\" (UID: \"957b9816-c3b2-4085-9c84-b215810dd6ca\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j"
Apr 22 16:07:26.291621 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:26.291605 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/957b9816-c3b2-4085-9c84-b215810dd6ca-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j\" (UID: \"957b9816-c3b2-4085-9c84-b215810dd6ca\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j"
Apr 22 16:07:26.298815 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:26.298792 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8njs\" (UniqueName: \"kubernetes.io/projected/957b9816-c3b2-4085-9c84-b215810dd6ca-kube-api-access-j8njs\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j\" (UID: \"957b9816-c3b2-4085-9c84-b215810dd6ca\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j"
Apr 22 16:07:26.419869 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:26.419776 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j"
Apr 22 16:07:26.551004 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:26.550969 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j"]
Apr 22 16:07:26.553805 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:07:26.553762 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod957b9816_c3b2_4085_9c84_b215810dd6ca.slice/crio-02fc40a343edca3ebe4fcb2b464476de9f7191108c08e9d21cc7c9752a0c6396 WatchSource:0}: Error finding container 02fc40a343edca3ebe4fcb2b464476de9f7191108c08e9d21cc7c9752a0c6396: Status 404 returned error can't find the container with id 02fc40a343edca3ebe4fcb2b464476de9f7191108c08e9d21cc7c9752a0c6396
Apr 22 16:07:26.899272 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:26.899236 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62"]
Apr 22 16:07:26.902802 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:26.902787 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62"
Apr 22 16:07:26.909491 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:26.909455 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62"]
Apr 22 16:07:26.996598 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:26.996564 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38c23430-348a-4872-9a3a-c52871fc1766-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62\" (UID: \"38c23430-348a-4872-9a3a-c52871fc1766\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62"
Apr 22 16:07:26.996805 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:26.996625 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtjpp\" (UniqueName: \"kubernetes.io/projected/38c23430-348a-4872-9a3a-c52871fc1766-kube-api-access-mtjpp\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62\" (UID: \"38c23430-348a-4872-9a3a-c52871fc1766\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62"
Apr 22 16:07:26.996805 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:26.996749 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38c23430-348a-4872-9a3a-c52871fc1766-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62\" (UID: \"38c23430-348a-4872-9a3a-c52871fc1766\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62"
Apr 22 16:07:27.098147 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.098095 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38c23430-348a-4872-9a3a-c52871fc1766-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62\" (UID: \"38c23430-348a-4872-9a3a-c52871fc1766\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62"
Apr 22 16:07:27.098147 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.098154 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtjpp\" (UniqueName: \"kubernetes.io/projected/38c23430-348a-4872-9a3a-c52871fc1766-kube-api-access-mtjpp\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62\" (UID: \"38c23430-348a-4872-9a3a-c52871fc1766\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62"
Apr 22 16:07:27.098424 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.098182 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38c23430-348a-4872-9a3a-c52871fc1766-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62\" (UID: \"38c23430-348a-4872-9a3a-c52871fc1766\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62"
Apr 22 16:07:27.098520 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.098500 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38c23430-348a-4872-9a3a-c52871fc1766-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62\" (UID: \"38c23430-348a-4872-9a3a-c52871fc1766\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62"
Apr 22 16:07:27.098560 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.098541 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38c23430-348a-4872-9a3a-c52871fc1766-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62\" (UID: \"38c23430-348a-4872-9a3a-c52871fc1766\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62"
Apr 22 16:07:27.105718 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.105691 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtjpp\" (UniqueName: \"kubernetes.io/projected/38c23430-348a-4872-9a3a-c52871fc1766-kube-api-access-mtjpp\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62\" (UID: \"38c23430-348a-4872-9a3a-c52871fc1766\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62"
Apr 22 16:07:27.212925 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.212841 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62"
Apr 22 16:07:27.262158 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.262126 2572 generic.go:358] "Generic (PLEG): container finished" podID="957b9816-c3b2-4085-9c84-b215810dd6ca" containerID="2c7961cd75924bb6d6722eea95e0fdc917144c923e9c8ae1d87be2ed3b0e31d5" exitCode=0
Apr 22 16:07:27.262334 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.262227 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j" event={"ID":"957b9816-c3b2-4085-9c84-b215810dd6ca","Type":"ContainerDied","Data":"2c7961cd75924bb6d6722eea95e0fdc917144c923e9c8ae1d87be2ed3b0e31d5"}
Apr 22 16:07:27.262334 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.262263 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j" event={"ID":"957b9816-c3b2-4085-9c84-b215810dd6ca","Type":"ContainerStarted","Data":"02fc40a343edca3ebe4fcb2b464476de9f7191108c08e9d21cc7c9752a0c6396"}
Apr 22 16:07:27.337811 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.337710 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62"]
Apr 22 16:07:27.339955 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:07:27.339928 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38c23430_348a_4872_9a3a_c52871fc1766.slice/crio-42c357d48dc1c32e7eb9a4a404c52b4a0a9978c629ef5e178ecca8a385e1296e WatchSource:0}: Error finding container 42c357d48dc1c32e7eb9a4a404c52b4a0a9978c629ef5e178ecca8a385e1296e: Status 404 returned error can't find the container with id 42c357d48dc1c32e7eb9a4a404c52b4a0a9978c629ef5e178ecca8a385e1296e
Apr 22 16:07:27.511607 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.511524 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l"]
Apr 22 16:07:27.514787 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.514769 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l"
Apr 22 16:07:27.521708 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.521682 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l"]
Apr 22 16:07:27.602145 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.602106 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gvz4\" (UniqueName: \"kubernetes.io/projected/f54c5736-e675-4465-a724-f1b413683899-kube-api-access-7gvz4\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l\" (UID: \"f54c5736-e675-4465-a724-f1b413683899\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l"
Apr 22 16:07:27.602145 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.602153 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f54c5736-e675-4465-a724-f1b413683899-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l\" (UID: \"f54c5736-e675-4465-a724-f1b413683899\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l"
Apr 22 16:07:27.602391 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.602243 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f54c5736-e675-4465-a724-f1b413683899-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l\" (UID: \"f54c5736-e675-4465-a724-f1b413683899\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l"
Apr 22 16:07:27.703243 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.703179 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gvz4\" (UniqueName: \"kubernetes.io/projected/f54c5736-e675-4465-a724-f1b413683899-kube-api-access-7gvz4\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l\" (UID: \"f54c5736-e675-4465-a724-f1b413683899\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l"
Apr 22 16:07:27.703441 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.703253 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f54c5736-e675-4465-a724-f1b413683899-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l\" (UID: \"f54c5736-e675-4465-a724-f1b413683899\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l"
Apr 22 16:07:27.703441 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.703308 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f54c5736-e675-4465-a724-f1b413683899-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l\" (UID: \"f54c5736-e675-4465-a724-f1b413683899\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l"
Apr 22 16:07:27.703706 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.703684 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f54c5736-e675-4465-a724-f1b413683899-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l\" (UID: \"f54c5736-e675-4465-a724-f1b413683899\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l"
Apr 22 16:07:27.703784 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.703722 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f54c5736-e675-4465-a724-f1b413683899-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l\" (UID: \"f54c5736-e675-4465-a724-f1b413683899\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l"
Apr 22 16:07:27.711951 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.711917 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gvz4\" (UniqueName: \"kubernetes.io/projected/f54c5736-e675-4465-a724-f1b413683899-kube-api-access-7gvz4\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l\" (UID: \"f54c5736-e675-4465-a724-f1b413683899\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l"
Apr 22 16:07:27.824932 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.824902 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l"
Apr 22 16:07:27.905612 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.905582 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9"]
Apr 22 16:07:27.910585 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.910555 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9"
Apr 22 16:07:27.917108 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.917077 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9"]
Apr 22 16:07:27.954536 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:27.954510 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l"]
Apr 22 16:07:27.956729 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:07:27.956700 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf54c5736_e675_4465_a724_f1b413683899.slice/crio-3ac5ad0cbab5bd5ecb1d07dc82be88f5f74cf4ed70651c75529239fb36aa00d9 WatchSource:0}: Error finding container 3ac5ad0cbab5bd5ecb1d07dc82be88f5f74cf4ed70651c75529239fb36aa00d9: Status 404 returned error can't find the container with id 3ac5ad0cbab5bd5ecb1d07dc82be88f5f74cf4ed70651c75529239fb36aa00d9
Apr 22 16:07:28.006468 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:28.006437 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53b8f533-ce84-45f1-913e-fa4b16c3cea9-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9\" (UID: \"53b8f533-ce84-45f1-913e-fa4b16c3cea9\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9"
Apr 22 16:07:28.006610 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:28.006497 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj4bl\" (UniqueName: \"kubernetes.io/projected/53b8f533-ce84-45f1-913e-fa4b16c3cea9-kube-api-access-qj4bl\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9\" (UID: \"53b8f533-ce84-45f1-913e-fa4b16c3cea9\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9"
Apr 22 16:07:28.006610 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:28.006537 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53b8f533-ce84-45f1-913e-fa4b16c3cea9-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9\" (UID: \"53b8f533-ce84-45f1-913e-fa4b16c3cea9\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9"
Apr 22 16:07:28.107184 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:28.107150 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53b8f533-ce84-45f1-913e-fa4b16c3cea9-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9\" (UID: \"53b8f533-ce84-45f1-913e-fa4b16c3cea9\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9"
Apr 22 16:07:28.107577 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:28.107214 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qj4bl\" (UniqueName: \"kubernetes.io/projected/53b8f533-ce84-45f1-913e-fa4b16c3cea9-kube-api-access-qj4bl\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9\" (UID: \"53b8f533-ce84-45f1-913e-fa4b16c3cea9\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9"
Apr 22 16:07:28.107577 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:28.107347 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53b8f533-ce84-45f1-913e-fa4b16c3cea9-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9\" (UID: \"53b8f533-ce84-45f1-913e-fa4b16c3cea9\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9"
Apr 22 16:07:28.107577 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:28.107534 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53b8f533-ce84-45f1-913e-fa4b16c3cea9-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9\" (UID: \"53b8f533-ce84-45f1-913e-fa4b16c3cea9\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9"
Apr 22 16:07:28.107737 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:28.107693 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53b8f533-ce84-45f1-913e-fa4b16c3cea9-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9\" (UID: \"53b8f533-ce84-45f1-913e-fa4b16c3cea9\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9"
Apr 22 16:07:28.114995 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:28.114975 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj4bl\" (UniqueName: \"kubernetes.io/projected/53b8f533-ce84-45f1-913e-fa4b16c3cea9-kube-api-access-qj4bl\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9\" (UID: \"53b8f533-ce84-45f1-913e-fa4b16c3cea9\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9"
Apr 22 16:07:28.231081 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:28.231046 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9"
Apr 22 16:07:28.268473 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:28.268439 2572 generic.go:358] "Generic (PLEG): container finished" podID="957b9816-c3b2-4085-9c84-b215810dd6ca" containerID="4ea62f36eede80eccdcecba63fac576b9acbf43077f32009effd4dc2dfc4d482" exitCode=0
Apr 22 16:07:28.268654 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:28.268530 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j" event={"ID":"957b9816-c3b2-4085-9c84-b215810dd6ca","Type":"ContainerDied","Data":"4ea62f36eede80eccdcecba63fac576b9acbf43077f32009effd4dc2dfc4d482"}
Apr 22 16:07:28.270028 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:28.270002 2572 generic.go:358] "Generic (PLEG): container finished" podID="38c23430-348a-4872-9a3a-c52871fc1766" containerID="b9bd6788ba997badf5375500a9fd1baf83ff0fda827ffbbc0abec1c2857fcfe5" exitCode=0
Apr 22 16:07:28.270136 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:28.270095 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62" event={"ID":"38c23430-348a-4872-9a3a-c52871fc1766","Type":"ContainerDied","Data":"b9bd6788ba997badf5375500a9fd1baf83ff0fda827ffbbc0abec1c2857fcfe5"}
Apr 22 16:07:28.270136 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:28.270120 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62" event={"ID":"38c23430-348a-4872-9a3a-c52871fc1766","Type":"ContainerStarted","Data":"42c357d48dc1c32e7eb9a4a404c52b4a0a9978c629ef5e178ecca8a385e1296e"}
Apr 22 16:07:28.271703 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:28.271682 2572 generic.go:358] "Generic (PLEG): container finished" podID="f54c5736-e675-4465-a724-f1b413683899" containerID="2f7c2fbad48b724b84bde2c6b5208654daa70703a1e5753482238a0445bf34a9" exitCode=0
Apr 22 16:07:28.271821 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:28.271771 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l" event={"ID":"f54c5736-e675-4465-a724-f1b413683899","Type":"ContainerDied","Data":"2f7c2fbad48b724b84bde2c6b5208654daa70703a1e5753482238a0445bf34a9"}
Apr 22 16:07:28.271821 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:28.271801 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l" event={"ID":"f54c5736-e675-4465-a724-f1b413683899","Type":"ContainerStarted","Data":"3ac5ad0cbab5bd5ecb1d07dc82be88f5f74cf4ed70651c75529239fb36aa00d9"}
Apr 22 16:07:28.365523 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:28.365500 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9"]
Apr 22 16:07:28.368090 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:07:28.368058 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53b8f533_ce84_45f1_913e_fa4b16c3cea9.slice/crio-b650d43265d78b5d512041df3a857d5118575bc17c89c7b62aeb8d83025e5aa2 WatchSource:0}: Error finding container b650d43265d78b5d512041df3a857d5118575bc17c89c7b62aeb8d83025e5aa2: Status 404 returned error can't find the container with id b650d43265d78b5d512041df3a857d5118575bc17c89c7b62aeb8d83025e5aa2
Apr 22 16:07:29.276440 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:29.276408 2572 generic.go:358] "Generic (PLEG): container finished" podID="53b8f533-ce84-45f1-913e-fa4b16c3cea9" containerID="0fb58d96cc1237ed15beb3fc5c26fbc61ad0c9f54dc7835b57d36868905012dc" exitCode=0
Apr 22 16:07:29.276859 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:29.276502 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9" event={"ID":"53b8f533-ce84-45f1-913e-fa4b16c3cea9","Type":"ContainerDied","Data":"0fb58d96cc1237ed15beb3fc5c26fbc61ad0c9f54dc7835b57d36868905012dc"}
Apr 22 16:07:29.276859 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:29.276545 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9" event={"ID":"53b8f533-ce84-45f1-913e-fa4b16c3cea9","Type":"ContainerStarted","Data":"b650d43265d78b5d512041df3a857d5118575bc17c89c7b62aeb8d83025e5aa2"}
Apr 22 16:07:29.278646 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:29.278622 2572 generic.go:358] "Generic (PLEG): container finished" podID="957b9816-c3b2-4085-9c84-b215810dd6ca" containerID="030aa743b0487b32eb03a297e2a65684f5cc76a599510e179c71a8907d41797f" exitCode=0
Apr 22 16:07:29.278751 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:29.278667 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j" event={"ID":"957b9816-c3b2-4085-9c84-b215810dd6ca","Type":"ContainerDied","Data":"030aa743b0487b32eb03a297e2a65684f5cc76a599510e179c71a8907d41797f"}
Apr 22 16:07:30.283488 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:30.283455 2572 generic.go:358] "Generic (PLEG): container finished" podID="38c23430-348a-4872-9a3a-c52871fc1766" containerID="1b5b3ebc12dcc915e039d52e5af0b12a7cca998228732772656d7825dc05af8a" exitCode=0
Apr 22 16:07:30.283925 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:30.283533 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62" event={"ID":"38c23430-348a-4872-9a3a-c52871fc1766","Type":"ContainerDied","Data":"1b5b3ebc12dcc915e039d52e5af0b12a7cca998228732772656d7825dc05af8a"}
Apr 22 16:07:30.285075 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:30.285056 2572 generic.go:358] "Generic (PLEG): container finished" podID="f54c5736-e675-4465-a724-f1b413683899" containerID="7c9155b40b5b7310cb0e617a7ced20ab1696a7641231386376f9bb12eb97bc28" exitCode=0
Apr 22 16:07:30.285159 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:30.285135 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l" event={"ID":"f54c5736-e675-4465-a724-f1b413683899","Type":"ContainerDied","Data":"7c9155b40b5b7310cb0e617a7ced20ab1696a7641231386376f9bb12eb97bc28"}
Apr 22 16:07:30.560548 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:30.560481 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j"
Apr 22 16:07:30.628824 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:30.628784 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/957b9816-c3b2-4085-9c84-b215810dd6ca-util\") pod \"957b9816-c3b2-4085-9c84-b215810dd6ca\" (UID: \"957b9816-c3b2-4085-9c84-b215810dd6ca\") "
Apr 22 16:07:30.628998 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:30.628870 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/957b9816-c3b2-4085-9c84-b215810dd6ca-bundle\") pod \"957b9816-c3b2-4085-9c84-b215810dd6ca\" (UID: \"957b9816-c3b2-4085-9c84-b215810dd6ca\") "
Apr 22 16:07:30.628998 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:30.628908 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8njs\" (UniqueName: \"kubernetes.io/projected/957b9816-c3b2-4085-9c84-b215810dd6ca-kube-api-access-j8njs\") pod \"957b9816-c3b2-4085-9c84-b215810dd6ca\" (UID: \"957b9816-c3b2-4085-9c84-b215810dd6ca\") "
Apr 22 16:07:30.629391 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:30.629368 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/957b9816-c3b2-4085-9c84-b215810dd6ca-bundle" (OuterVolumeSpecName: "bundle") pod "957b9816-c3b2-4085-9c84-b215810dd6ca" (UID: "957b9816-c3b2-4085-9c84-b215810dd6ca"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 16:07:30.631184 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:30.631164 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/957b9816-c3b2-4085-9c84-b215810dd6ca-kube-api-access-j8njs" (OuterVolumeSpecName: "kube-api-access-j8njs") pod "957b9816-c3b2-4085-9c84-b215810dd6ca" (UID: "957b9816-c3b2-4085-9c84-b215810dd6ca"). InnerVolumeSpecName "kube-api-access-j8njs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:07:30.634358 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:30.634332 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/957b9816-c3b2-4085-9c84-b215810dd6ca-util" (OuterVolumeSpecName: "util") pod "957b9816-c3b2-4085-9c84-b215810dd6ca" (UID: "957b9816-c3b2-4085-9c84-b215810dd6ca"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 16:07:30.730506 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:30.730453 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/957b9816-c3b2-4085-9c84-b215810dd6ca-bundle\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\""
Apr 22 16:07:30.730506 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:30.730504 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j8njs\" (UniqueName: \"kubernetes.io/projected/957b9816-c3b2-4085-9c84-b215810dd6ca-kube-api-access-j8njs\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\""
Apr 22 16:07:30.730506 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:30.730517 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/957b9816-c3b2-4085-9c84-b215810dd6ca-util\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\""
Apr 22 16:07:31.294672 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:31.294633 2572 generic.go:358] "Generic (PLEG): container finished" podID="f54c5736-e675-4465-a724-f1b413683899" containerID="6b3de03f954b173fb3116ee98fdc123b5035589cf9a3fb19319c7f0586353745" exitCode=0
Apr 22 16:07:31.295089 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:31.294703 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l" event={"ID":"f54c5736-e675-4465-a724-f1b413683899","Type":"ContainerDied","Data":"6b3de03f954b173fb3116ee98fdc123b5035589cf9a3fb19319c7f0586353745"}
Apr 22 16:07:31.296211 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:31.296170 2572 generic.go:358] "Generic (PLEG): container finished" podID="53b8f533-ce84-45f1-913e-fa4b16c3cea9" containerID="2f6f2c1362a79bd79b66873225fd254e3296300ad8edbcf285f13bc8ae78e2c3" exitCode=0
Apr 22 16:07:31.296349 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:31.296222 2572 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9" event={"ID":"53b8f533-ce84-45f1-913e-fa4b16c3cea9","Type":"ContainerDied","Data":"2f6f2c1362a79bd79b66873225fd254e3296300ad8edbcf285f13bc8ae78e2c3"} Apr 22 16:07:31.298061 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:31.298032 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j" event={"ID":"957b9816-c3b2-4085-9c84-b215810dd6ca","Type":"ContainerDied","Data":"02fc40a343edca3ebe4fcb2b464476de9f7191108c08e9d21cc7c9752a0c6396"} Apr 22 16:07:31.298061 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:31.298059 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02fc40a343edca3ebe4fcb2b464476de9f7191108c08e9d21cc7c9752a0c6396" Apr 22 16:07:31.298180 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:31.298059 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j" Apr 22 16:07:31.300094 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:31.300066 2572 generic.go:358] "Generic (PLEG): container finished" podID="38c23430-348a-4872-9a3a-c52871fc1766" containerID="ac3c69c198383c97a75aa41bc73e4a3435dcbc5d62815beee85de95857f761df" exitCode=0 Apr 22 16:07:31.300187 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:31.300104 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62" event={"ID":"38c23430-348a-4872-9a3a-c52871fc1766","Type":"ContainerDied","Data":"ac3c69c198383c97a75aa41bc73e4a3435dcbc5d62815beee85de95857f761df"} Apr 22 16:07:32.306155 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:32.306120 2572 generic.go:358] "Generic (PLEG): container finished" podID="53b8f533-ce84-45f1-913e-fa4b16c3cea9" 
containerID="b4adc92599e2ded69c7f0e411bcc513a3ce64b94d4c6a707fdb1a91bd84cdde0" exitCode=0 Apr 22 16:07:32.306605 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:32.306225 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9" event={"ID":"53b8f533-ce84-45f1-913e-fa4b16c3cea9","Type":"ContainerDied","Data":"b4adc92599e2ded69c7f0e411bcc513a3ce64b94d4c6a707fdb1a91bd84cdde0"} Apr 22 16:07:32.452055 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:32.452032 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62" Apr 22 16:07:32.455338 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:32.455320 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l" Apr 22 16:07:32.545723 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:32.545687 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtjpp\" (UniqueName: \"kubernetes.io/projected/38c23430-348a-4872-9a3a-c52871fc1766-kube-api-access-mtjpp\") pod \"38c23430-348a-4872-9a3a-c52871fc1766\" (UID: \"38c23430-348a-4872-9a3a-c52871fc1766\") " Apr 22 16:07:32.545723 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:32.545725 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f54c5736-e675-4465-a724-f1b413683899-util\") pod \"f54c5736-e675-4465-a724-f1b413683899\" (UID: \"f54c5736-e675-4465-a724-f1b413683899\") " Apr 22 16:07:32.545977 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:32.545752 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38c23430-348a-4872-9a3a-c52871fc1766-util\") pod \"38c23430-348a-4872-9a3a-c52871fc1766\" (UID: 
\"38c23430-348a-4872-9a3a-c52871fc1766\") " Apr 22 16:07:32.545977 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:32.545773 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gvz4\" (UniqueName: \"kubernetes.io/projected/f54c5736-e675-4465-a724-f1b413683899-kube-api-access-7gvz4\") pod \"f54c5736-e675-4465-a724-f1b413683899\" (UID: \"f54c5736-e675-4465-a724-f1b413683899\") " Apr 22 16:07:32.545977 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:32.545805 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f54c5736-e675-4465-a724-f1b413683899-bundle\") pod \"f54c5736-e675-4465-a724-f1b413683899\" (UID: \"f54c5736-e675-4465-a724-f1b413683899\") " Apr 22 16:07:32.545977 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:32.545863 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38c23430-348a-4872-9a3a-c52871fc1766-bundle\") pod \"38c23430-348a-4872-9a3a-c52871fc1766\" (UID: \"38c23430-348a-4872-9a3a-c52871fc1766\") " Apr 22 16:07:32.546434 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:32.546410 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f54c5736-e675-4465-a724-f1b413683899-bundle" (OuterVolumeSpecName: "bundle") pod "f54c5736-e675-4465-a724-f1b413683899" (UID: "f54c5736-e675-4465-a724-f1b413683899"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:07:32.546518 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:32.546438 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38c23430-348a-4872-9a3a-c52871fc1766-bundle" (OuterVolumeSpecName: "bundle") pod "38c23430-348a-4872-9a3a-c52871fc1766" (UID: "38c23430-348a-4872-9a3a-c52871fc1766"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:07:32.548087 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:32.548053 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38c23430-348a-4872-9a3a-c52871fc1766-kube-api-access-mtjpp" (OuterVolumeSpecName: "kube-api-access-mtjpp") pod "38c23430-348a-4872-9a3a-c52871fc1766" (UID: "38c23430-348a-4872-9a3a-c52871fc1766"). InnerVolumeSpecName "kube-api-access-mtjpp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:07:32.548368 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:32.548347 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f54c5736-e675-4465-a724-f1b413683899-kube-api-access-7gvz4" (OuterVolumeSpecName: "kube-api-access-7gvz4") pod "f54c5736-e675-4465-a724-f1b413683899" (UID: "f54c5736-e675-4465-a724-f1b413683899"). InnerVolumeSpecName "kube-api-access-7gvz4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:07:32.551909 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:32.551887 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38c23430-348a-4872-9a3a-c52871fc1766-util" (OuterVolumeSpecName: "util") pod "38c23430-348a-4872-9a3a-c52871fc1766" (UID: "38c23430-348a-4872-9a3a-c52871fc1766"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:07:32.552267 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:32.552247 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f54c5736-e675-4465-a724-f1b413683899-util" (OuterVolumeSpecName: "util") pod "f54c5736-e675-4465-a724-f1b413683899" (UID: "f54c5736-e675-4465-a724-f1b413683899"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:07:32.647051 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:32.647017 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38c23430-348a-4872-9a3a-c52871fc1766-util\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:07:32.647051 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:32.647047 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7gvz4\" (UniqueName: \"kubernetes.io/projected/f54c5736-e675-4465-a724-f1b413683899-kube-api-access-7gvz4\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:07:32.647051 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:32.647056 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f54c5736-e675-4465-a724-f1b413683899-bundle\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:07:32.647324 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:32.647065 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38c23430-348a-4872-9a3a-c52871fc1766-bundle\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:07:32.647324 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:32.647074 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mtjpp\" (UniqueName: \"kubernetes.io/projected/38c23430-348a-4872-9a3a-c52871fc1766-kube-api-access-mtjpp\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:07:32.647324 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:32.647082 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f54c5736-e675-4465-a724-f1b413683899-util\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:07:33.311067 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:33.311035 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62" Apr 22 16:07:33.311515 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:33.311034 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62" event={"ID":"38c23430-348a-4872-9a3a-c52871fc1766","Type":"ContainerDied","Data":"42c357d48dc1c32e7eb9a4a404c52b4a0a9978c629ef5e178ecca8a385e1296e"} Apr 22 16:07:33.311515 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:33.311144 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42c357d48dc1c32e7eb9a4a404c52b4a0a9978c629ef5e178ecca8a385e1296e" Apr 22 16:07:33.312772 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:33.312749 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l" Apr 22 16:07:33.312868 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:33.312766 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l" event={"ID":"f54c5736-e675-4465-a724-f1b413683899","Type":"ContainerDied","Data":"3ac5ad0cbab5bd5ecb1d07dc82be88f5f74cf4ed70651c75529239fb36aa00d9"} Apr 22 16:07:33.312868 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:33.312793 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ac5ad0cbab5bd5ecb1d07dc82be88f5f74cf4ed70651c75529239fb36aa00d9" Apr 22 16:07:33.437001 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:33.436980 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9" Apr 22 16:07:33.556158 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:33.556125 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53b8f533-ce84-45f1-913e-fa4b16c3cea9-bundle\") pod \"53b8f533-ce84-45f1-913e-fa4b16c3cea9\" (UID: \"53b8f533-ce84-45f1-913e-fa4b16c3cea9\") " Apr 22 16:07:33.556358 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:33.556245 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj4bl\" (UniqueName: \"kubernetes.io/projected/53b8f533-ce84-45f1-913e-fa4b16c3cea9-kube-api-access-qj4bl\") pod \"53b8f533-ce84-45f1-913e-fa4b16c3cea9\" (UID: \"53b8f533-ce84-45f1-913e-fa4b16c3cea9\") " Apr 22 16:07:33.556358 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:33.556278 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53b8f533-ce84-45f1-913e-fa4b16c3cea9-util\") pod \"53b8f533-ce84-45f1-913e-fa4b16c3cea9\" (UID: \"53b8f533-ce84-45f1-913e-fa4b16c3cea9\") " Apr 22 16:07:33.556755 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:33.556726 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53b8f533-ce84-45f1-913e-fa4b16c3cea9-bundle" (OuterVolumeSpecName: "bundle") pod "53b8f533-ce84-45f1-913e-fa4b16c3cea9" (UID: "53b8f533-ce84-45f1-913e-fa4b16c3cea9"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:07:33.558665 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:33.558636 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53b8f533-ce84-45f1-913e-fa4b16c3cea9-kube-api-access-qj4bl" (OuterVolumeSpecName: "kube-api-access-qj4bl") pod "53b8f533-ce84-45f1-913e-fa4b16c3cea9" (UID: "53b8f533-ce84-45f1-913e-fa4b16c3cea9"). InnerVolumeSpecName "kube-api-access-qj4bl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:07:33.561714 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:33.561656 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53b8f533-ce84-45f1-913e-fa4b16c3cea9-util" (OuterVolumeSpecName: "util") pod "53b8f533-ce84-45f1-913e-fa4b16c3cea9" (UID: "53b8f533-ce84-45f1-913e-fa4b16c3cea9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:07:33.659481 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:33.659451 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qj4bl\" (UniqueName: \"kubernetes.io/projected/53b8f533-ce84-45f1-913e-fa4b16c3cea9-kube-api-access-qj4bl\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:07:33.659481 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:33.659477 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53b8f533-ce84-45f1-913e-fa4b16c3cea9-util\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:07:33.659671 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:33.659489 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53b8f533-ce84-45f1-913e-fa4b16c3cea9-bundle\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:07:34.317958 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:34.317925 2572 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9" event={"ID":"53b8f533-ce84-45f1-913e-fa4b16c3cea9","Type":"ContainerDied","Data":"b650d43265d78b5d512041df3a857d5118575bc17c89c7b62aeb8d83025e5aa2"} Apr 22 16:07:34.317958 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:34.317958 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b650d43265d78b5d512041df3a857d5118575bc17c89c7b62aeb8d83025e5aa2" Apr 22 16:07:34.318371 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:34.318008 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9" Apr 22 16:07:44.143885 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.143845 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx"] Apr 22 16:07:44.144641 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.144614 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="957b9816-c3b2-4085-9c84-b215810dd6ca" containerName="util" Apr 22 16:07:44.144783 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.144773 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="957b9816-c3b2-4085-9c84-b215810dd6ca" containerName="util" Apr 22 16:07:44.144862 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.144842 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38c23430-348a-4872-9a3a-c52871fc1766" containerName="util" Apr 22 16:07:44.144862 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.144860 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="38c23430-348a-4872-9a3a-c52871fc1766" containerName="util" Apr 22 16:07:44.145045 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.144877 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53b8f533-ce84-45f1-913e-fa4b16c3cea9" 
containerName="util" Apr 22 16:07:44.145045 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.144886 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="53b8f533-ce84-45f1-913e-fa4b16c3cea9" containerName="util" Apr 22 16:07:44.145045 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.144897 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f54c5736-e675-4465-a724-f1b413683899" containerName="pull" Apr 22 16:07:44.145045 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.144904 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f54c5736-e675-4465-a724-f1b413683899" containerName="pull" Apr 22 16:07:44.145045 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.144914 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f54c5736-e675-4465-a724-f1b413683899" containerName="extract" Apr 22 16:07:44.145045 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.144925 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f54c5736-e675-4465-a724-f1b413683899" containerName="extract" Apr 22 16:07:44.145045 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.144934 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53b8f533-ce84-45f1-913e-fa4b16c3cea9" containerName="extract" Apr 22 16:07:44.145045 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.144941 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="53b8f533-ce84-45f1-913e-fa4b16c3cea9" containerName="extract" Apr 22 16:07:44.145045 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.144956 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="957b9816-c3b2-4085-9c84-b215810dd6ca" containerName="extract" Apr 22 16:07:44.145045 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.144964 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="957b9816-c3b2-4085-9c84-b215810dd6ca" containerName="extract" Apr 22 16:07:44.145045 ip-10-0-135-9 kubenswrapper[2572]: I0422 
16:07:44.144979 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f54c5736-e675-4465-a724-f1b413683899" containerName="util" Apr 22 16:07:44.145045 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.144987 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f54c5736-e675-4465-a724-f1b413683899" containerName="util" Apr 22 16:07:44.145045 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.144999 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38c23430-348a-4872-9a3a-c52871fc1766" containerName="pull" Apr 22 16:07:44.145045 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.145005 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="38c23430-348a-4872-9a3a-c52871fc1766" containerName="pull" Apr 22 16:07:44.145045 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.145012 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53b8f533-ce84-45f1-913e-fa4b16c3cea9" containerName="pull" Apr 22 16:07:44.145045 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.145020 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="53b8f533-ce84-45f1-913e-fa4b16c3cea9" containerName="pull" Apr 22 16:07:44.145045 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.145030 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="957b9816-c3b2-4085-9c84-b215810dd6ca" containerName="pull" Apr 22 16:07:44.145045 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.145037 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="957b9816-c3b2-4085-9c84-b215810dd6ca" containerName="pull" Apr 22 16:07:44.145045 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.145050 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38c23430-348a-4872-9a3a-c52871fc1766" containerName="extract" Apr 22 16:07:44.145606 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.145058 2572 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="38c23430-348a-4872-9a3a-c52871fc1766" containerName="extract" Apr 22 16:07:44.145606 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.145126 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="957b9816-c3b2-4085-9c84-b215810dd6ca" containerName="extract" Apr 22 16:07:44.145606 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.145136 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="38c23430-348a-4872-9a3a-c52871fc1766" containerName="extract" Apr 22 16:07:44.145606 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.145142 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f54c5736-e675-4465-a724-f1b413683899" containerName="extract" Apr 22 16:07:44.145606 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.145148 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="53b8f533-ce84-45f1-913e-fa4b16c3cea9" containerName="extract" Apr 22 16:07:44.188013 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.187979 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx"] Apr 22 16:07:44.188152 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.188112 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx" Apr 22 16:07:44.190518 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.190497 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-ws92q\"" Apr 22 16:07:44.341627 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.341586 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g87r5\" (UniqueName: \"kubernetes.io/projected/b50e10fd-0de1-4080-886c-4501d5a189fb-kube-api-access-g87r5\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-t8qjx\" (UID: \"b50e10fd-0de1-4080-886c-4501d5a189fb\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx" Apr 22 16:07:44.341808 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.341704 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b50e10fd-0de1-4080-886c-4501d5a189fb-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-t8qjx\" (UID: \"b50e10fd-0de1-4080-886c-4501d5a189fb\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx" Apr 22 16:07:44.442349 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.442249 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b50e10fd-0de1-4080-886c-4501d5a189fb-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-t8qjx\" (UID: \"b50e10fd-0de1-4080-886c-4501d5a189fb\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx" Apr 22 16:07:44.442349 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.442299 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-g87r5\" (UniqueName: \"kubernetes.io/projected/b50e10fd-0de1-4080-886c-4501d5a189fb-kube-api-access-g87r5\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-t8qjx\" (UID: \"b50e10fd-0de1-4080-886c-4501d5a189fb\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx" Apr 22 16:07:44.442644 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.442624 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b50e10fd-0de1-4080-886c-4501d5a189fb-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-t8qjx\" (UID: \"b50e10fd-0de1-4080-886c-4501d5a189fb\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx" Apr 22 16:07:44.454208 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.454175 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g87r5\" (UniqueName: \"kubernetes.io/projected/b50e10fd-0de1-4080-886c-4501d5a189fb-kube-api-access-g87r5\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-t8qjx\" (UID: \"b50e10fd-0de1-4080-886c-4501d5a189fb\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx" Apr 22 16:07:44.497746 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.497708 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx"
Apr 22 16:07:44.623411 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:44.623275 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx"]
Apr 22 16:07:44.626409 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:07:44.626379 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb50e10fd_0de1_4080_886c_4501d5a189fb.slice/crio-727abda28879152a0a7dc9a172d8438f91aba648def7c7d15cf1c2d6c91d562b WatchSource:0}: Error finding container 727abda28879152a0a7dc9a172d8438f91aba648def7c7d15cf1c2d6c91d562b: Status 404 returned error can't find the container with id 727abda28879152a0a7dc9a172d8438f91aba648def7c7d15cf1c2d6c91d562b
Apr 22 16:07:45.354792 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:45.354756 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx" event={"ID":"b50e10fd-0de1-4080-886c-4501d5a189fb","Type":"ContainerStarted","Data":"727abda28879152a0a7dc9a172d8438f91aba648def7c7d15cf1c2d6c91d562b"}
Apr 22 16:07:46.054935 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:46.054893 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-zvlgm"]
Apr 22 16:07:46.057082 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:46.057059 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zvlgm"
Apr 22 16:07:46.059545 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:46.059522 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
Apr 22 16:07:46.059647 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:46.059549 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-79csl\""
Apr 22 16:07:46.076583 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:46.076553 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-zvlgm"]
Apr 22 16:07:46.158154 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:46.158117 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brmqx\" (UniqueName: \"kubernetes.io/projected/7d93114d-1c93-4d36-86b9-65f0997ff998-kube-api-access-brmqx\") pod \"dns-operator-controller-manager-648d5c98bc-zvlgm\" (UID: \"7d93114d-1c93-4d36-86b9-65f0997ff998\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zvlgm"
Apr 22 16:07:46.259404 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:46.259364 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brmqx\" (UniqueName: \"kubernetes.io/projected/7d93114d-1c93-4d36-86b9-65f0997ff998-kube-api-access-brmqx\") pod \"dns-operator-controller-manager-648d5c98bc-zvlgm\" (UID: \"7d93114d-1c93-4d36-86b9-65f0997ff998\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zvlgm"
Apr 22 16:07:46.273472 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:46.273433 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brmqx\" (UniqueName: \"kubernetes.io/projected/7d93114d-1c93-4d36-86b9-65f0997ff998-kube-api-access-brmqx\") pod \"dns-operator-controller-manager-648d5c98bc-zvlgm\" (UID: \"7d93114d-1c93-4d36-86b9-65f0997ff998\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zvlgm"
Apr 22 16:07:46.366915 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:46.366876 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zvlgm"
Apr 22 16:07:46.512409 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:46.512375 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-zvlgm"]
Apr 22 16:07:46.516310 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:07:46.516263 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d93114d_1c93_4d36_86b9_65f0997ff998.slice/crio-1fd97ca7d4e378ca300722b36538bd9088fd642344b5848fcb6ed01fe7676f0b WatchSource:0}: Error finding container 1fd97ca7d4e378ca300722b36538bd9088fd642344b5848fcb6ed01fe7676f0b: Status 404 returned error can't find the container with id 1fd97ca7d4e378ca300722b36538bd9088fd642344b5848fcb6ed01fe7676f0b
Apr 22 16:07:47.363660 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:47.363626 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zvlgm" event={"ID":"7d93114d-1c93-4d36-86b9-65f0997ff998","Type":"ContainerStarted","Data":"1fd97ca7d4e378ca300722b36538bd9088fd642344b5848fcb6ed01fe7676f0b"}
Apr 22 16:07:49.423763 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:49.423729 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-qtr2m"]
Apr 22 16:07:49.425873 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:49.425857 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-qtr2m"
Apr 22 16:07:49.428253 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:49.428230 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-9dwtn\""
Apr 22 16:07:49.437170 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:49.437145 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-qtr2m"]
Apr 22 16:07:49.589277 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:49.589243 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnwrp\" (UniqueName: \"kubernetes.io/projected/22a670b9-8247-4770-8552-d1c70e953050-kube-api-access-bnwrp\") pod \"authorino-operator-657f44b778-qtr2m\" (UID: \"22a670b9-8247-4770-8552-d1c70e953050\") " pod="kuadrant-system/authorino-operator-657f44b778-qtr2m"
Apr 22 16:07:49.689821 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:49.689708 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bnwrp\" (UniqueName: \"kubernetes.io/projected/22a670b9-8247-4770-8552-d1c70e953050-kube-api-access-bnwrp\") pod \"authorino-operator-657f44b778-qtr2m\" (UID: \"22a670b9-8247-4770-8552-d1c70e953050\") " pod="kuadrant-system/authorino-operator-657f44b778-qtr2m"
Apr 22 16:07:49.703477 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:49.703444 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnwrp\" (UniqueName: \"kubernetes.io/projected/22a670b9-8247-4770-8552-d1c70e953050-kube-api-access-bnwrp\") pod \"authorino-operator-657f44b778-qtr2m\" (UID: \"22a670b9-8247-4770-8552-d1c70e953050\") " pod="kuadrant-system/authorino-operator-657f44b778-qtr2m"
Apr 22 16:07:49.737315 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:49.737279 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-qtr2m"
Apr 22 16:07:49.883481 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:49.883449 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-qtr2m"]
Apr 22 16:07:50.376889 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:50.376846 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zvlgm" event={"ID":"7d93114d-1c93-4d36-86b9-65f0997ff998","Type":"ContainerStarted","Data":"ac4abd0afbcd5fb1b9617d63e4303eeb08587bd6d549304c05f769cccdc74dc2"}
Apr 22 16:07:50.377093 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:50.377028 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zvlgm"
Apr 22 16:07:50.400112 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:50.400047 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zvlgm" podStartSLOduration=1.576131391 podStartE2EDuration="4.400025255s" podCreationTimestamp="2026-04-22 16:07:46 +0000 UTC" firstStartedPulling="2026-04-22 16:07:46.518862122 +0000 UTC m=+547.418416171" lastFinishedPulling="2026-04-22 16:07:49.342755983 +0000 UTC m=+550.242310035" observedRunningTime="2026-04-22 16:07:50.39779948 +0000 UTC m=+551.297353561" watchObservedRunningTime="2026-04-22 16:07:50.400025255 +0000 UTC m=+551.299579327"
Apr 22 16:07:51.026164 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:07:51.026130 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22a670b9_8247_4770_8552_d1c70e953050.slice/crio-bec907a3b2d861e8fa65963ebccc490846fabc9690432fbc6149b2eff59fb2df WatchSource:0}: Error finding container bec907a3b2d861e8fa65963ebccc490846fabc9690432fbc6149b2eff59fb2df: Status 404 returned error can't find the container with id bec907a3b2d861e8fa65963ebccc490846fabc9690432fbc6149b2eff59fb2df
Apr 22 16:07:51.381732 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:51.381695 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-qtr2m" event={"ID":"22a670b9-8247-4770-8552-d1c70e953050","Type":"ContainerStarted","Data":"bec907a3b2d861e8fa65963ebccc490846fabc9690432fbc6149b2eff59fb2df"}
Apr 22 16:07:51.383258 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:51.383227 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx" event={"ID":"b50e10fd-0de1-4080-886c-4501d5a189fb","Type":"ContainerStarted","Data":"413b06b1b6afc61f99eba0acea9c862d329a8153d01387e4c8763ce9f74f3a8e"}
Apr 22 16:07:51.383400 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:51.383287 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx"
Apr 22 16:07:51.400160 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:51.400112 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx" podStartSLOduration=0.96680656 podStartE2EDuration="7.400098526s" podCreationTimestamp="2026-04-22 16:07:44 +0000 UTC" firstStartedPulling="2026-04-22 16:07:44.628678999 +0000 UTC m=+545.528233048" lastFinishedPulling="2026-04-22 16:07:51.061970964 +0000 UTC m=+551.961525014" observedRunningTime="2026-04-22 16:07:51.39809155 +0000 UTC m=+552.297645631" watchObservedRunningTime="2026-04-22 16:07:51.400098526 +0000 UTC m=+552.299652596"
Apr 22 16:07:54.394800 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:54.394761 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-qtr2m" event={"ID":"22a670b9-8247-4770-8552-d1c70e953050","Type":"ContainerStarted","Data":"43ace62232e7e14545660d842bf496590527dd55fdb9fab7c7810b7bf0482e0f"}
Apr 22 16:07:54.395240 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:54.394939 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-qtr2m"
Apr 22 16:07:54.414619 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:07:54.414564 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-qtr2m" podStartSLOduration=2.441033541 podStartE2EDuration="5.414546832s" podCreationTimestamp="2026-04-22 16:07:49 +0000 UTC" firstStartedPulling="2026-04-22 16:07:51.028773626 +0000 UTC m=+551.928327679" lastFinishedPulling="2026-04-22 16:07:54.002286916 +0000 UTC m=+554.901840970" observedRunningTime="2026-04-22 16:07:54.412731278 +0000 UTC m=+555.312285349" watchObservedRunningTime="2026-04-22 16:07:54.414546832 +0000 UTC m=+555.314100905"
Apr 22 16:08:01.385380 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:01.385352 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zvlgm"
Apr 22 16:08:02.388441 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:02.388410 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx"
Apr 22 16:08:04.118180 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.118143 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx"]
Apr 22 16:08:04.118624 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.118395 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx" podUID="b50e10fd-0de1-4080-886c-4501d5a189fb" containerName="manager" containerID="cri-o://413b06b1b6afc61f99eba0acea9c862d329a8153d01387e4c8763ce9f74f3a8e" gracePeriod=2
Apr 22 16:08:04.120870 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.120837 2572 status_manager.go:895] "Failed to get status for pod" podUID="b50e10fd-0de1-4080-886c-4501d5a189fb" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-t8qjx\" is forbidden: User \"system:node:ip-10-0-135-9.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-9.ec2.internal' and this object"
Apr 22 16:08:04.121909 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.121878 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx"]
Apr 22 16:08:04.141185 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.141158 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thg5w"]
Apr 22 16:08:04.141515 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.141496 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b50e10fd-0de1-4080-886c-4501d5a189fb" containerName="manager"
Apr 22 16:08:04.141610 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.141517 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b50e10fd-0de1-4080-886c-4501d5a189fb" containerName="manager"
Apr 22 16:08:04.141684 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.141612 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b50e10fd-0de1-4080-886c-4501d5a189fb" containerName="manager"
Apr 22 16:08:04.143484 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.143465 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thg5w"
Apr 22 16:08:04.153379 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.153349 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thg5w"]
Apr 22 16:08:04.170991 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.170954 2572 status_manager.go:895] "Failed to get status for pod" podUID="b50e10fd-0de1-4080-886c-4501d5a189fb" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-t8qjx\" is forbidden: User \"system:node:ip-10-0-135-9.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-9.ec2.internal' and this object"
Apr 22 16:08:04.318857 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.318820 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/01e9151d-e357-44e3-89e9-c35fbaad7e56-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-thg5w\" (UID: \"01e9151d-e357-44e3-89e9-c35fbaad7e56\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thg5w"
Apr 22 16:08:04.319061 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.318951 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcjrk\" (UniqueName: \"kubernetes.io/projected/01e9151d-e357-44e3-89e9-c35fbaad7e56-kube-api-access-tcjrk\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-thg5w\" (UID: \"01e9151d-e357-44e3-89e9-c35fbaad7e56\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thg5w"
Apr 22 16:08:04.343829 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.343802 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx"
Apr 22 16:08:04.346041 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.346017 2572 status_manager.go:895] "Failed to get status for pod" podUID="b50e10fd-0de1-4080-886c-4501d5a189fb" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-t8qjx\" is forbidden: User \"system:node:ip-10-0-135-9.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-9.ec2.internal' and this object"
Apr 22 16:08:04.419810 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.419733 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/01e9151d-e357-44e3-89e9-c35fbaad7e56-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-thg5w\" (UID: \"01e9151d-e357-44e3-89e9-c35fbaad7e56\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thg5w"
Apr 22 16:08:04.419810 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.419802 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tcjrk\" (UniqueName: \"kubernetes.io/projected/01e9151d-e357-44e3-89e9-c35fbaad7e56-kube-api-access-tcjrk\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-thg5w\" (UID: \"01e9151d-e357-44e3-89e9-c35fbaad7e56\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thg5w"
Apr 22 16:08:04.420169 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.420147 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/01e9151d-e357-44e3-89e9-c35fbaad7e56-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-thg5w\" (UID: \"01e9151d-e357-44e3-89e9-c35fbaad7e56\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thg5w"
Apr 22 16:08:04.429658 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.429621 2572 generic.go:358] "Generic (PLEG): container finished" podID="b50e10fd-0de1-4080-886c-4501d5a189fb" containerID="413b06b1b6afc61f99eba0acea9c862d329a8153d01387e4c8763ce9f74f3a8e" exitCode=0
Apr 22 16:08:04.429808 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.429678 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx"
Apr 22 16:08:04.429808 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.429724 2572 scope.go:117] "RemoveContainer" containerID="413b06b1b6afc61f99eba0acea9c862d329a8153d01387e4c8763ce9f74f3a8e"
Apr 22 16:08:04.431606 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.431575 2572 status_manager.go:895] "Failed to get status for pod" podUID="b50e10fd-0de1-4080-886c-4501d5a189fb" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-t8qjx\" is forbidden: User \"system:node:ip-10-0-135-9.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-9.ec2.internal' and this object"
Apr 22 16:08:04.432794 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.432773 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcjrk\" (UniqueName: \"kubernetes.io/projected/01e9151d-e357-44e3-89e9-c35fbaad7e56-kube-api-access-tcjrk\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-thg5w\" (UID: \"01e9151d-e357-44e3-89e9-c35fbaad7e56\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thg5w"
Apr 22 16:08:04.442598 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.442580 2572 scope.go:117] "RemoveContainer" containerID="413b06b1b6afc61f99eba0acea9c862d329a8153d01387e4c8763ce9f74f3a8e"
Apr 22 16:08:04.442871 ip-10-0-135-9 kubenswrapper[2572]: E0422 16:08:04.442854 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"413b06b1b6afc61f99eba0acea9c862d329a8153d01387e4c8763ce9f74f3a8e\": container with ID starting with 413b06b1b6afc61f99eba0acea9c862d329a8153d01387e4c8763ce9f74f3a8e not found: ID does not exist" containerID="413b06b1b6afc61f99eba0acea9c862d329a8153d01387e4c8763ce9f74f3a8e"
Apr 22 16:08:04.442924 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.442878 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"413b06b1b6afc61f99eba0acea9c862d329a8153d01387e4c8763ce9f74f3a8e"} err="failed to get container status \"413b06b1b6afc61f99eba0acea9c862d329a8153d01387e4c8763ce9f74f3a8e\": rpc error: code = NotFound desc = could not find container \"413b06b1b6afc61f99eba0acea9c862d329a8153d01387e4c8763ce9f74f3a8e\": container with ID starting with 413b06b1b6afc61f99eba0acea9c862d329a8153d01387e4c8763ce9f74f3a8e not found: ID does not exist"
Apr 22 16:08:04.500338 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.500304 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thg5w"
Apr 22 16:08:04.521160 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.521127 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b50e10fd-0de1-4080-886c-4501d5a189fb-extensions-socket-volume\") pod \"b50e10fd-0de1-4080-886c-4501d5a189fb\" (UID: \"b50e10fd-0de1-4080-886c-4501d5a189fb\") "
Apr 22 16:08:04.521291 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.521250 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g87r5\" (UniqueName: \"kubernetes.io/projected/b50e10fd-0de1-4080-886c-4501d5a189fb-kube-api-access-g87r5\") pod \"b50e10fd-0de1-4080-886c-4501d5a189fb\" (UID: \"b50e10fd-0de1-4080-886c-4501d5a189fb\") "
Apr 22 16:08:04.521623 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.521595 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b50e10fd-0de1-4080-886c-4501d5a189fb-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "b50e10fd-0de1-4080-886c-4501d5a189fb" (UID: "b50e10fd-0de1-4080-886c-4501d5a189fb"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 16:08:04.523547 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.523525 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b50e10fd-0de1-4080-886c-4501d5a189fb-kube-api-access-g87r5" (OuterVolumeSpecName: "kube-api-access-g87r5") pod "b50e10fd-0de1-4080-886c-4501d5a189fb" (UID: "b50e10fd-0de1-4080-886c-4501d5a189fb"). InnerVolumeSpecName "kube-api-access-g87r5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:08:04.622992 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.622959 2572 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b50e10fd-0de1-4080-886c-4501d5a189fb-extensions-socket-volume\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\""
Apr 22 16:08:04.622992 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.622996 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g87r5\" (UniqueName: \"kubernetes.io/projected/b50e10fd-0de1-4080-886c-4501d5a189fb-kube-api-access-g87r5\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\""
Apr 22 16:08:04.658566 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.658542 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thg5w"]
Apr 22 16:08:04.660754 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:08:04.660728 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01e9151d_e357_44e3_89e9_c35fbaad7e56.slice/crio-62ed242ab94347bc20938ca5c9733238854077ac4b2ab7c69c510df937fc327f WatchSource:0}: Error finding container 62ed242ab94347bc20938ca5c9733238854077ac4b2ab7c69c510df937fc327f: Status 404 returned error can't find the container with id 62ed242ab94347bc20938ca5c9733238854077ac4b2ab7c69c510df937fc327f
Apr 22 16:08:04.740418 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:04.740381 2572 status_manager.go:895] "Failed to get status for pod" podUID="b50e10fd-0de1-4080-886c-4501d5a189fb" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-t8qjx\" is forbidden: User \"system:node:ip-10-0-135-9.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-9.ec2.internal' and this object"
Apr 22 16:08:05.400467 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:05.400380 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-qtr2m"
Apr 22 16:08:05.403177 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:05.403153 2572 status_manager.go:895] "Failed to get status for pod" podUID="b50e10fd-0de1-4080-886c-4501d5a189fb" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-t8qjx\" is forbidden: User \"system:node:ip-10-0-135-9.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-9.ec2.internal' and this object"
Apr 22 16:08:05.436765 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:05.436730 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thg5w" event={"ID":"01e9151d-e357-44e3-89e9-c35fbaad7e56","Type":"ContainerStarted","Data":"131f5f23cdcb99336d4d335e081a3a3e680b86e8723cd32feeeabe814163cf0b"}
Apr 22 16:08:05.436765 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:05.436768 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thg5w" event={"ID":"01e9151d-e357-44e3-89e9-c35fbaad7e56","Type":"ContainerStarted","Data":"62ed242ab94347bc20938ca5c9733238854077ac4b2ab7c69c510df937fc327f"}
Apr 22 16:08:05.437015 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:05.436870 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thg5w"
Apr 22 16:08:05.439167 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:05.439129 2572 status_manager.go:895] "Failed to get status for pod" podUID="b50e10fd-0de1-4080-886c-4501d5a189fb" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-t8qjx" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-t8qjx\" is forbidden: User \"system:node:ip-10-0-135-9.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-9.ec2.internal' and this object"
Apr 22 16:08:05.465047 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:05.464995 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thg5w" podStartSLOduration=1.464977127 podStartE2EDuration="1.464977127s" podCreationTimestamp="2026-04-22 16:08:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:08:05.463365214 +0000 UTC m=+566.362919284" watchObservedRunningTime="2026-04-22 16:08:05.464977127 +0000 UTC m=+566.364531197"
Apr 22 16:08:05.645524 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:05.645483 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b50e10fd-0de1-4080-886c-4501d5a189fb" path="/var/lib/kubelet/pods/b50e10fd-0de1-4080-886c-4501d5a189fb/volumes"
Apr 22 16:08:16.442289 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:16.442254 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thg5w"
Apr 22 16:08:19.672866 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:19.672820 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thg5w"]
Apr 22 16:08:19.673451 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:19.673145 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thg5w" podUID="01e9151d-e357-44e3-89e9-c35fbaad7e56" containerName="manager" containerID="cri-o://131f5f23cdcb99336d4d335e081a3a3e680b86e8723cd32feeeabe814163cf0b" gracePeriod=10
Apr 22 16:08:19.888500 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:19.888469 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tbrwn"]
Apr 22 16:08:19.892700 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:19.892683 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tbrwn"
Apr 22 16:08:19.902849 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:19.902822 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tbrwn"]
Apr 22 16:08:19.920288 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:19.920254 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thg5w"
Apr 22 16:08:19.927145 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:19.927084 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcjrk\" (UniqueName: \"kubernetes.io/projected/01e9151d-e357-44e3-89e9-c35fbaad7e56-kube-api-access-tcjrk\") pod \"01e9151d-e357-44e3-89e9-c35fbaad7e56\" (UID: \"01e9151d-e357-44e3-89e9-c35fbaad7e56\") "
Apr 22 16:08:19.927145 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:19.927123 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/01e9151d-e357-44e3-89e9-c35fbaad7e56-extensions-socket-volume\") pod \"01e9151d-e357-44e3-89e9-c35fbaad7e56\" (UID: \"01e9151d-e357-44e3-89e9-c35fbaad7e56\") "
Apr 22 16:08:19.927356 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:19.927213 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pngwj\" (UniqueName: \"kubernetes.io/projected/cd66014d-0f3d-4db1-8b88-684f4d91bebf-kube-api-access-pngwj\") pod \"kuadrant-operator-controller-manager-55c7f4c975-tbrwn\" (UID: \"cd66014d-0f3d-4db1-8b88-684f4d91bebf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tbrwn"
Apr 22 16:08:19.927356 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:19.927266 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/cd66014d-0f3d-4db1-8b88-684f4d91bebf-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-tbrwn\" (UID: \"cd66014d-0f3d-4db1-8b88-684f4d91bebf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tbrwn"
Apr 22 16:08:19.927559 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:19.927537 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01e9151d-e357-44e3-89e9-c35fbaad7e56-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "01e9151d-e357-44e3-89e9-c35fbaad7e56" (UID: "01e9151d-e357-44e3-89e9-c35fbaad7e56"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 16:08:19.929286 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:19.929262 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01e9151d-e357-44e3-89e9-c35fbaad7e56-kube-api-access-tcjrk" (OuterVolumeSpecName: "kube-api-access-tcjrk") pod "01e9151d-e357-44e3-89e9-c35fbaad7e56" (UID: "01e9151d-e357-44e3-89e9-c35fbaad7e56"). InnerVolumeSpecName "kube-api-access-tcjrk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:08:20.027628 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:20.027587 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pngwj\" (UniqueName: \"kubernetes.io/projected/cd66014d-0f3d-4db1-8b88-684f4d91bebf-kube-api-access-pngwj\") pod \"kuadrant-operator-controller-manager-55c7f4c975-tbrwn\" (UID: \"cd66014d-0f3d-4db1-8b88-684f4d91bebf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tbrwn"
Apr 22 16:08:20.027812 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:20.027674 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/cd66014d-0f3d-4db1-8b88-684f4d91bebf-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-tbrwn\" (UID: \"cd66014d-0f3d-4db1-8b88-684f4d91bebf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tbrwn"
Apr 22 16:08:20.027812 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:20.027707 2572 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/01e9151d-e357-44e3-89e9-c35fbaad7e56-extensions-socket-volume\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\""
Apr 22 16:08:20.027812 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:20.027717 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tcjrk\" (UniqueName: \"kubernetes.io/projected/01e9151d-e357-44e3-89e9-c35fbaad7e56-kube-api-access-tcjrk\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\""
Apr 22 16:08:20.028067 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:20.028050 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/cd66014d-0f3d-4db1-8b88-684f4d91bebf-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-tbrwn\" (UID: \"cd66014d-0f3d-4db1-8b88-684f4d91bebf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tbrwn"
Apr 22 16:08:20.036387 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:20.036353 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pngwj\" (UniqueName: \"kubernetes.io/projected/cd66014d-0f3d-4db1-8b88-684f4d91bebf-kube-api-access-pngwj\") pod \"kuadrant-operator-controller-manager-55c7f4c975-tbrwn\" (UID: \"cd66014d-0f3d-4db1-8b88-684f4d91bebf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tbrwn"
Apr 22 16:08:20.204227 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:20.204112 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tbrwn"
Apr 22 16:08:20.332223 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:20.332164 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tbrwn"]
Apr 22 16:08:20.334985 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:08:20.334952 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd66014d_0f3d_4db1_8b88_684f4d91bebf.slice/crio-0ff169be7cb07a251a6bc65e3780e583bd0715a2ebafc6c2e94e1212d22a68fb WatchSource:0}: Error finding container 0ff169be7cb07a251a6bc65e3780e583bd0715a2ebafc6c2e94e1212d22a68fb: Status 404 returned error can't find the container with id 0ff169be7cb07a251a6bc65e3780e583bd0715a2ebafc6c2e94e1212d22a68fb
Apr 22 16:08:20.493069 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:20.492975 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tbrwn" event={"ID":"cd66014d-0f3d-4db1-8b88-684f4d91bebf","Type":"ContainerStarted","Data":"90f6c5eb9d39dc2439ccfc18b15eda18ca4a3a4342bdb0f24858fa97003380fb"}
Apr 22 16:08:20.493069 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:20.493021 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tbrwn" event={"ID":"cd66014d-0f3d-4db1-8b88-684f4d91bebf","Type":"ContainerStarted","Data":"0ff169be7cb07a251a6bc65e3780e583bd0715a2ebafc6c2e94e1212d22a68fb"}
Apr 22 16:08:20.493340 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:20.493090 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tbrwn"
Apr 22 16:08:20.494267 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:20.494236 2572 generic.go:358] "Generic (PLEG): container finished" podID="01e9151d-e357-44e3-89e9-c35fbaad7e56" containerID="131f5f23cdcb99336d4d335e081a3a3e680b86e8723cd32feeeabe814163cf0b" exitCode=0
Apr 22 16:08:20.494382 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:20.494297 2572 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thg5w" Apr 22 16:08:20.494382 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:20.494350 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thg5w" event={"ID":"01e9151d-e357-44e3-89e9-c35fbaad7e56","Type":"ContainerDied","Data":"131f5f23cdcb99336d4d335e081a3a3e680b86e8723cd32feeeabe814163cf0b"} Apr 22 16:08:20.494382 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:20.494379 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thg5w" event={"ID":"01e9151d-e357-44e3-89e9-c35fbaad7e56","Type":"ContainerDied","Data":"62ed242ab94347bc20938ca5c9733238854077ac4b2ab7c69c510df937fc327f"} Apr 22 16:08:20.494482 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:20.494398 2572 scope.go:117] "RemoveContainer" containerID="131f5f23cdcb99336d4d335e081a3a3e680b86e8723cd32feeeabe814163cf0b" Apr 22 16:08:20.503772 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:20.503752 2572 scope.go:117] "RemoveContainer" containerID="131f5f23cdcb99336d4d335e081a3a3e680b86e8723cd32feeeabe814163cf0b" Apr 22 16:08:20.504058 ip-10-0-135-9 kubenswrapper[2572]: E0422 16:08:20.504042 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"131f5f23cdcb99336d4d335e081a3a3e680b86e8723cd32feeeabe814163cf0b\": container with ID starting with 131f5f23cdcb99336d4d335e081a3a3e680b86e8723cd32feeeabe814163cf0b not found: ID does not exist" containerID="131f5f23cdcb99336d4d335e081a3a3e680b86e8723cd32feeeabe814163cf0b" Apr 22 16:08:20.504107 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:20.504068 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131f5f23cdcb99336d4d335e081a3a3e680b86e8723cd32feeeabe814163cf0b"} err="failed to get container status 
\"131f5f23cdcb99336d4d335e081a3a3e680b86e8723cd32feeeabe814163cf0b\": rpc error: code = NotFound desc = could not find container \"131f5f23cdcb99336d4d335e081a3a3e680b86e8723cd32feeeabe814163cf0b\": container with ID starting with 131f5f23cdcb99336d4d335e081a3a3e680b86e8723cd32feeeabe814163cf0b not found: ID does not exist" Apr 22 16:08:20.514936 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:20.514887 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tbrwn" podStartSLOduration=1.514871636 podStartE2EDuration="1.514871636s" podCreationTimestamp="2026-04-22 16:08:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:08:20.512287568 +0000 UTC m=+581.411841665" watchObservedRunningTime="2026-04-22 16:08:20.514871636 +0000 UTC m=+581.414425707" Apr 22 16:08:20.530750 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:20.530712 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thg5w"] Apr 22 16:08:20.534865 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:20.534820 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thg5w"] Apr 22 16:08:21.645219 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:21.645169 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01e9151d-e357-44e3-89e9-c35fbaad7e56" path="/var/lib/kubelet/pods/01e9151d-e357-44e3-89e9-c35fbaad7e56/volumes" Apr 22 16:08:31.501125 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:31.501089 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tbrwn" Apr 22 16:08:35.887124 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:35.887095 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6"] Apr 22 16:08:35.887519 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:35.887439 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01e9151d-e357-44e3-89e9-c35fbaad7e56" containerName="manager" Apr 22 16:08:35.887519 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:35.887454 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e9151d-e357-44e3-89e9-c35fbaad7e56" containerName="manager" Apr 22 16:08:35.887519 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:35.887501 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="01e9151d-e357-44e3-89e9-c35fbaad7e56" containerName="manager" Apr 22 16:08:35.891034 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:35.891011 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:35.893735 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:35.893705 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-ft7bz\"" Apr 22 16:08:35.901149 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:35.901123 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6"] Apr 22 16:08:35.950648 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:35.950611 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/23455089-f91a-4266-8b97-86fb67ed9d63-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:35.950648 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:35.950649 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/23455089-f91a-4266-8b97-86fb67ed9d63-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:35.950908 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:35.950678 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/23455089-f91a-4266-8b97-86fb67ed9d63-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:35.950908 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:35.950759 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/23455089-f91a-4266-8b97-86fb67ed9d63-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:35.950908 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:35.950849 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/23455089-f91a-4266-8b97-86fb67ed9d63-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:35.950908 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:35.950898 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/23455089-f91a-4266-8b97-86fb67ed9d63-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:35.951054 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:35.950943 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5pvp\" (UniqueName: \"kubernetes.io/projected/23455089-f91a-4266-8b97-86fb67ed9d63-kube-api-access-p5pvp\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:35.951054 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:35.950966 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/23455089-f91a-4266-8b97-86fb67ed9d63-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:35.951054 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:35.951019 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/23455089-f91a-4266-8b97-86fb67ed9d63-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:36.051447 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.051398 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: 
\"kubernetes.io/empty-dir/23455089-f91a-4266-8b97-86fb67ed9d63-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:36.051447 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.051454 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/23455089-f91a-4266-8b97-86fb67ed9d63-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:36.051716 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.051480 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/23455089-f91a-4266-8b97-86fb67ed9d63-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:36.051716 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.051523 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/23455089-f91a-4266-8b97-86fb67ed9d63-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:36.051716 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.051564 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/23455089-f91a-4266-8b97-86fb67ed9d63-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: 
\"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:36.051716 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.051607 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/23455089-f91a-4266-8b97-86fb67ed9d63-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:36.051716 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.051648 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/23455089-f91a-4266-8b97-86fb67ed9d63-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:36.051716 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.051678 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5pvp\" (UniqueName: \"kubernetes.io/projected/23455089-f91a-4266-8b97-86fb67ed9d63-kube-api-access-p5pvp\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:36.051716 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.051701 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/23455089-f91a-4266-8b97-86fb67ed9d63-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 
22 16:08:36.052030 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.051889 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/23455089-f91a-4266-8b97-86fb67ed9d63-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:36.052030 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.051916 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/23455089-f91a-4266-8b97-86fb67ed9d63-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:36.052172 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.052153 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/23455089-f91a-4266-8b97-86fb67ed9d63-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:36.052249 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.052217 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/23455089-f91a-4266-8b97-86fb67ed9d63-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:36.052386 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.052364 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" 
(UniqueName: \"kubernetes.io/configmap/23455089-f91a-4266-8b97-86fb67ed9d63-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:36.053980 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.053952 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/23455089-f91a-4266-8b97-86fb67ed9d63-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:36.054325 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.054289 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/23455089-f91a-4266-8b97-86fb67ed9d63-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:36.060789 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.060762 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/23455089-f91a-4266-8b97-86fb67ed9d63-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: \"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:36.061032 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.061009 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5pvp\" (UniqueName: \"kubernetes.io/projected/23455089-f91a-4266-8b97-86fb67ed9d63-kube-api-access-p5pvp\") pod \"maas-default-gateway-openshift-default-58b6f876-sbvm6\" (UID: 
\"23455089-f91a-4266-8b97-86fb67ed9d63\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:36.205368 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.205278 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:36.331745 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.331717 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6"] Apr 22 16:08:36.333879 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:08:36.333849 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23455089_f91a_4266_8b97_86fb67ed9d63.slice/crio-f4e9070a0be3d6e161c76211220ae9ff107c406d20b0ba29da52c416fc393765 WatchSource:0}: Error finding container f4e9070a0be3d6e161c76211220ae9ff107c406d20b0ba29da52c416fc393765: Status 404 returned error can't find the container with id f4e9070a0be3d6e161c76211220ae9ff107c406d20b0ba29da52c416fc393765 Apr 22 16:08:36.336405 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.336374 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 22 16:08:36.336489 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.336444 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 22 16:08:36.336489 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.336477 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 22 16:08:36.553012 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.552933 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" event={"ID":"23455089-f91a-4266-8b97-86fb67ed9d63","Type":"ContainerStarted","Data":"80fcef7d44255ba37d2ca8d7dbed86a1ef73fec43766ca6d713213494b773f11"} Apr 22 16:08:36.553012 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.552968 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" event={"ID":"23455089-f91a-4266-8b97-86fb67ed9d63","Type":"ContainerStarted","Data":"f4e9070a0be3d6e161c76211220ae9ff107c406d20b0ba29da52c416fc393765"} Apr 22 16:08:36.569296 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:36.569238 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" podStartSLOduration=1.569219124 podStartE2EDuration="1.569219124s" podCreationTimestamp="2026-04-22 16:08:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:08:36.568608896 +0000 UTC m=+597.468162970" watchObservedRunningTime="2026-04-22 16:08:36.569219124 +0000 UTC m=+597.468773188" Apr 22 16:08:37.205707 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:37.205660 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:37.210713 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:37.210684 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:37.557304 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:37.557216 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:37.558380 
ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:37.558360 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sbvm6" Apr 22 16:08:39.547726 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:39.547646 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxznf_fd3073fe-435c-4974-821b-9229018bf5f4/ovn-acl-logging/0.log" Apr 22 16:08:39.548980 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:39.548958 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxznf_fd3073fe-435c-4974-821b-9229018bf5f4/ovn-acl-logging/0.log" Apr 22 16:08:49.591160 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:49.591123 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-x8qpx"] Apr 22 16:08:49.599323 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:49.599293 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-x8qpx" Apr 22 16:08:49.600624 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:49.600596 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-x8qpx"] Apr 22 16:08:49.602067 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:49.602043 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-ds5xl\"" Apr 22 16:08:49.661894 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:49.661853 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xql4f\" (UniqueName: \"kubernetes.io/projected/97040a04-0963-4859-8dd7-acf62bb36c85-kube-api-access-xql4f\") pod \"authorino-f99f4b5cd-x8qpx\" (UID: \"97040a04-0963-4859-8dd7-acf62bb36c85\") " pod="kuadrant-system/authorino-f99f4b5cd-x8qpx" Apr 22 16:08:49.713418 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:49.713388 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-qngws"] Apr 22 16:08:49.715509 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:49.715492 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-qngws" Apr 22 16:08:49.723507 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:49.723483 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-qngws"] Apr 22 16:08:49.762652 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:49.762609 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6bnz\" (UniqueName: \"kubernetes.io/projected/e204f648-e4d5-4969-aa37-2acf94284698-kube-api-access-f6bnz\") pod \"authorino-7498df8756-qngws\" (UID: \"e204f648-e4d5-4969-aa37-2acf94284698\") " pod="kuadrant-system/authorino-7498df8756-qngws" Apr 22 16:08:49.762803 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:49.762669 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xql4f\" (UniqueName: \"kubernetes.io/projected/97040a04-0963-4859-8dd7-acf62bb36c85-kube-api-access-xql4f\") pod \"authorino-f99f4b5cd-x8qpx\" (UID: \"97040a04-0963-4859-8dd7-acf62bb36c85\") " pod="kuadrant-system/authorino-f99f4b5cd-x8qpx" Apr 22 16:08:49.770483 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:49.770450 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xql4f\" (UniqueName: \"kubernetes.io/projected/97040a04-0963-4859-8dd7-acf62bb36c85-kube-api-access-xql4f\") pod \"authorino-f99f4b5cd-x8qpx\" (UID: \"97040a04-0963-4859-8dd7-acf62bb36c85\") " pod="kuadrant-system/authorino-f99f4b5cd-x8qpx" Apr 22 16:08:49.864046 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:49.863957 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f6bnz\" (UniqueName: \"kubernetes.io/projected/e204f648-e4d5-4969-aa37-2acf94284698-kube-api-access-f6bnz\") pod \"authorino-7498df8756-qngws\" (UID: \"e204f648-e4d5-4969-aa37-2acf94284698\") " pod="kuadrant-system/authorino-7498df8756-qngws" Apr 22 16:08:49.871340 ip-10-0-135-9 
kubenswrapper[2572]: I0422 16:08:49.871315 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6bnz\" (UniqueName: \"kubernetes.io/projected/e204f648-e4d5-4969-aa37-2acf94284698-kube-api-access-f6bnz\") pod \"authorino-7498df8756-qngws\" (UID: \"e204f648-e4d5-4969-aa37-2acf94284698\") " pod="kuadrant-system/authorino-7498df8756-qngws"
Apr 22 16:08:49.912216 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:49.912151 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-x8qpx"
Apr 22 16:08:50.025642 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:50.025610 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-qngws"
Apr 22 16:08:50.038306 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:50.038262 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-x8qpx"]
Apr 22 16:08:50.039849 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:08:50.039819 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97040a04_0963_4859_8dd7_acf62bb36c85.slice/crio-18977631f90957a933e004f990055d3bbbca0c4827e5653155bb7aee5e1e46b1 WatchSource:0}: Error finding container 18977631f90957a933e004f990055d3bbbca0c4827e5653155bb7aee5e1e46b1: Status 404 returned error can't find the container with id 18977631f90957a933e004f990055d3bbbca0c4827e5653155bb7aee5e1e46b1
Apr 22 16:08:50.154941 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:50.154916 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-qngws"]
Apr 22 16:08:50.157209 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:08:50.157168 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode204f648_e4d5_4969_aa37_2acf94284698.slice/crio-d78d0463c7feccd26e632e4b7cb26a23117336c60d680d86ea6e7d81c0da924c WatchSource:0}: Error finding container d78d0463c7feccd26e632e4b7cb26a23117336c60d680d86ea6e7d81c0da924c: Status 404 returned error can't find the container with id d78d0463c7feccd26e632e4b7cb26a23117336c60d680d86ea6e7d81c0da924c
Apr 22 16:08:50.605399 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:50.605364 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-qngws" event={"ID":"e204f648-e4d5-4969-aa37-2acf94284698","Type":"ContainerStarted","Data":"d78d0463c7feccd26e632e4b7cb26a23117336c60d680d86ea6e7d81c0da924c"}
Apr 22 16:08:50.606508 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:50.606481 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-x8qpx" event={"ID":"97040a04-0963-4859-8dd7-acf62bb36c85","Type":"ContainerStarted","Data":"18977631f90957a933e004f990055d3bbbca0c4827e5653155bb7aee5e1e46b1"}
Apr 22 16:08:54.634370 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:54.634332 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-qngws" event={"ID":"e204f648-e4d5-4969-aa37-2acf94284698","Type":"ContainerStarted","Data":"7c4ca0d26336f8b0bfb8afa67259e976d8cd6e4347e80637d6b3fdb6758316c6"}
Apr 22 16:08:54.635688 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:54.635659 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-x8qpx" event={"ID":"97040a04-0963-4859-8dd7-acf62bb36c85","Type":"ContainerStarted","Data":"dd6d0b9903183eea0f46cd10327c00224d3641d13f59f8901a77cf5352b3986d"}
Apr 22 16:08:54.647849 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:54.647798 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-qngws" podStartSLOduration=2.087232709 podStartE2EDuration="5.647782132s" podCreationTimestamp="2026-04-22 16:08:49 +0000 UTC" firstStartedPulling="2026-04-22 16:08:50.158536135 +0000 UTC m=+611.058090187" lastFinishedPulling="2026-04-22 16:08:53.71908556 +0000 UTC m=+614.618639610" observedRunningTime="2026-04-22 16:08:54.646523179 +0000 UTC m=+615.546077263" watchObservedRunningTime="2026-04-22 16:08:54.647782132 +0000 UTC m=+615.547336204"
Apr 22 16:08:54.659814 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:54.659767 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-x8qpx" podStartSLOduration=1.9866179179999999 podStartE2EDuration="5.659751732s" podCreationTimestamp="2026-04-22 16:08:49 +0000 UTC" firstStartedPulling="2026-04-22 16:08:50.041119466 +0000 UTC m=+610.940673515" lastFinishedPulling="2026-04-22 16:08:53.71425328 +0000 UTC m=+614.613807329" observedRunningTime="2026-04-22 16:08:54.658366705 +0000 UTC m=+615.557920776" watchObservedRunningTime="2026-04-22 16:08:54.659751732 +0000 UTC m=+615.559305803"
Apr 22 16:08:54.680170 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:54.680128 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-x8qpx"]
Apr 22 16:08:56.642423 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:56.642321 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-x8qpx" podUID="97040a04-0963-4859-8dd7-acf62bb36c85" containerName="authorino" containerID="cri-o://dd6d0b9903183eea0f46cd10327c00224d3641d13f59f8901a77cf5352b3986d" gracePeriod=30
Apr 22 16:08:56.880520 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:56.880490 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-x8qpx"
Apr 22 16:08:57.030543 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:57.030444 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xql4f\" (UniqueName: \"kubernetes.io/projected/97040a04-0963-4859-8dd7-acf62bb36c85-kube-api-access-xql4f\") pod \"97040a04-0963-4859-8dd7-acf62bb36c85\" (UID: \"97040a04-0963-4859-8dd7-acf62bb36c85\") "
Apr 22 16:08:57.032718 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:57.032683 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97040a04-0963-4859-8dd7-acf62bb36c85-kube-api-access-xql4f" (OuterVolumeSpecName: "kube-api-access-xql4f") pod "97040a04-0963-4859-8dd7-acf62bb36c85" (UID: "97040a04-0963-4859-8dd7-acf62bb36c85"). InnerVolumeSpecName "kube-api-access-xql4f". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:08:57.131465 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:57.131426 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xql4f\" (UniqueName: \"kubernetes.io/projected/97040a04-0963-4859-8dd7-acf62bb36c85-kube-api-access-xql4f\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\""
Apr 22 16:08:57.647035 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:57.646997 2572 generic.go:358] "Generic (PLEG): container finished" podID="97040a04-0963-4859-8dd7-acf62bb36c85" containerID="dd6d0b9903183eea0f46cd10327c00224d3641d13f59f8901a77cf5352b3986d" exitCode=0
Apr 22 16:08:57.647580 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:57.647051 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-x8qpx"
Apr 22 16:08:57.647580 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:57.647082 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-x8qpx" event={"ID":"97040a04-0963-4859-8dd7-acf62bb36c85","Type":"ContainerDied","Data":"dd6d0b9903183eea0f46cd10327c00224d3641d13f59f8901a77cf5352b3986d"}
Apr 22 16:08:57.647580 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:57.647120 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-x8qpx" event={"ID":"97040a04-0963-4859-8dd7-acf62bb36c85","Type":"ContainerDied","Data":"18977631f90957a933e004f990055d3bbbca0c4827e5653155bb7aee5e1e46b1"}
Apr 22 16:08:57.647580 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:57.647141 2572 scope.go:117] "RemoveContainer" containerID="dd6d0b9903183eea0f46cd10327c00224d3641d13f59f8901a77cf5352b3986d"
Apr 22 16:08:57.655696 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:57.655673 2572 scope.go:117] "RemoveContainer" containerID="dd6d0b9903183eea0f46cd10327c00224d3641d13f59f8901a77cf5352b3986d"
Apr 22 16:08:57.655982 ip-10-0-135-9 kubenswrapper[2572]: E0422 16:08:57.655961 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd6d0b9903183eea0f46cd10327c00224d3641d13f59f8901a77cf5352b3986d\": container with ID starting with dd6d0b9903183eea0f46cd10327c00224d3641d13f59f8901a77cf5352b3986d not found: ID does not exist" containerID="dd6d0b9903183eea0f46cd10327c00224d3641d13f59f8901a77cf5352b3986d"
Apr 22 16:08:57.656045 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:57.655991 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd6d0b9903183eea0f46cd10327c00224d3641d13f59f8901a77cf5352b3986d"} err="failed to get container status \"dd6d0b9903183eea0f46cd10327c00224d3641d13f59f8901a77cf5352b3986d\": rpc error: code = NotFound desc = could not find container \"dd6d0b9903183eea0f46cd10327c00224d3641d13f59f8901a77cf5352b3986d\": container with ID starting with dd6d0b9903183eea0f46cd10327c00224d3641d13f59f8901a77cf5352b3986d not found: ID does not exist"
Apr 22 16:08:57.668241 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:57.668209 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-x8qpx"]
Apr 22 16:08:57.670095 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:57.670070 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-x8qpx"]
Apr 22 16:08:59.649922 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:08:59.649889 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97040a04-0963-4859-8dd7-acf62bb36c85" path="/var/lib/kubelet/pods/97040a04-0963-4859-8dd7-acf62bb36c85/volumes"
Apr 22 16:09:18.430690 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:18.430652 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-rlsqk"]
Apr 22 16:09:18.433267 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:18.431073 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97040a04-0963-4859-8dd7-acf62bb36c85" containerName="authorino"
Apr 22 16:09:18.433267 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:18.431090 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="97040a04-0963-4859-8dd7-acf62bb36c85" containerName="authorino"
Apr 22 16:09:18.433267 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:18.431161 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="97040a04-0963-4859-8dd7-acf62bb36c85" containerName="authorino"
Apr 22 16:09:18.434158 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:18.434142 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-rlsqk"
Apr 22 16:09:18.440137 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:18.440113 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-rlsqk"]
Apr 22 16:09:18.516065 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:18.516032 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szdns\" (UniqueName: \"kubernetes.io/projected/f5a00a32-c4e0-49f4-804a-1a4c2b52b670-kube-api-access-szdns\") pod \"authorino-8b475cf9f-rlsqk\" (UID: \"f5a00a32-c4e0-49f4-804a-1a4c2b52b670\") " pod="kuadrant-system/authorino-8b475cf9f-rlsqk"
Apr 22 16:09:18.617127 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:18.617095 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-szdns\" (UniqueName: \"kubernetes.io/projected/f5a00a32-c4e0-49f4-804a-1a4c2b52b670-kube-api-access-szdns\") pod \"authorino-8b475cf9f-rlsqk\" (UID: \"f5a00a32-c4e0-49f4-804a-1a4c2b52b670\") " pod="kuadrant-system/authorino-8b475cf9f-rlsqk"
Apr 22 16:09:18.625340 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:18.625304 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-szdns\" (UniqueName: \"kubernetes.io/projected/f5a00a32-c4e0-49f4-804a-1a4c2b52b670-kube-api-access-szdns\") pod \"authorino-8b475cf9f-rlsqk\" (UID: \"f5a00a32-c4e0-49f4-804a-1a4c2b52b670\") " pod="kuadrant-system/authorino-8b475cf9f-rlsqk"
Apr 22 16:09:18.645055 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:18.645024 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-rlsqk"]
Apr 22 16:09:18.645290 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:18.645278 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-rlsqk"
Apr 22 16:09:18.672644 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:18.672597 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-65c8c56bf9-tr7pp"]
Apr 22 16:09:18.677673 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:18.677645 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-65c8c56bf9-tr7pp"
Apr 22 16:09:18.680459 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:18.680432 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-65c8c56bf9-tr7pp"]
Apr 22 16:09:18.720330 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:18.718526 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnkng\" (UniqueName: \"kubernetes.io/projected/f2b58d68-c0c0-4389-b76a-5c6187f8f9c7-kube-api-access-lnkng\") pod \"authorino-65c8c56bf9-tr7pp\" (UID: \"f2b58d68-c0c0-4389-b76a-5c6187f8f9c7\") " pod="kuadrant-system/authorino-65c8c56bf9-tr7pp"
Apr 22 16:09:18.779216 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:18.779154 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-rlsqk"]
Apr 22 16:09:18.783065 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:09:18.783038 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5a00a32_c4e0_49f4_804a_1a4c2b52b670.slice/crio-e99f8e5b1b76175c72ca29f0305b7b889cd41e2f8a0e3d51cf3de3a478ffdd09 WatchSource:0}: Error finding container e99f8e5b1b76175c72ca29f0305b7b889cd41e2f8a0e3d51cf3de3a478ffdd09: Status 404 returned error can't find the container with id e99f8e5b1b76175c72ca29f0305b7b889cd41e2f8a0e3d51cf3de3a478ffdd09
Apr 22 16:09:18.819817 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:18.819780 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lnkng\" (UniqueName: \"kubernetes.io/projected/f2b58d68-c0c0-4389-b76a-5c6187f8f9c7-kube-api-access-lnkng\") pod \"authorino-65c8c56bf9-tr7pp\" (UID: \"f2b58d68-c0c0-4389-b76a-5c6187f8f9c7\") " pod="kuadrant-system/authorino-65c8c56bf9-tr7pp"
Apr 22 16:09:18.827461 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:18.827434 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnkng\" (UniqueName: \"kubernetes.io/projected/f2b58d68-c0c0-4389-b76a-5c6187f8f9c7-kube-api-access-lnkng\") pod \"authorino-65c8c56bf9-tr7pp\" (UID: \"f2b58d68-c0c0-4389-b76a-5c6187f8f9c7\") " pod="kuadrant-system/authorino-65c8c56bf9-tr7pp"
Apr 22 16:09:18.959826 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:18.959728 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-65c8c56bf9-tr7pp"]
Apr 22 16:09:18.960031 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:18.960014 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-65c8c56bf9-tr7pp"
Apr 22 16:09:19.000637 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:19.000605 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-69c6ffbd8c-f8mhh"]
Apr 22 16:09:19.006693 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:19.006663 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-69c6ffbd8c-f8mhh"
Apr 22 16:09:19.011265 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:19.011240 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 22 16:09:19.025013 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:19.024983 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-69c6ffbd8c-f8mhh"]
Apr 22 16:09:19.092097 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:19.092058 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-65c8c56bf9-tr7pp"]
Apr 22 16:09:19.095574 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:09:19.095540 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2b58d68_c0c0_4389_b76a_5c6187f8f9c7.slice/crio-178402dc6dcaab836f37f38e43cf8b6c181eda85b28faa21154932c5cb25dd5c WatchSource:0}: Error finding container 178402dc6dcaab836f37f38e43cf8b6c181eda85b28faa21154932c5cb25dd5c: Status 404 returned error can't find the container with id 178402dc6dcaab836f37f38e43cf8b6c181eda85b28faa21154932c5cb25dd5c
Apr 22 16:09:19.121979 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:19.121945 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/613dbe12-2e1c-43b9-86bf-f3064082cd7c-tls-cert\") pod \"authorino-69c6ffbd8c-f8mhh\" (UID: \"613dbe12-2e1c-43b9-86bf-f3064082cd7c\") " pod="kuadrant-system/authorino-69c6ffbd8c-f8mhh"
Apr 22 16:09:19.122268 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:19.122240 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmdf8\" (UniqueName: \"kubernetes.io/projected/613dbe12-2e1c-43b9-86bf-f3064082cd7c-kube-api-access-fmdf8\") pod \"authorino-69c6ffbd8c-f8mhh\" (UID: \"613dbe12-2e1c-43b9-86bf-f3064082cd7c\") " pod="kuadrant-system/authorino-69c6ffbd8c-f8mhh"
Apr 22 16:09:19.223313 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:19.223210 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmdf8\" (UniqueName: \"kubernetes.io/projected/613dbe12-2e1c-43b9-86bf-f3064082cd7c-kube-api-access-fmdf8\") pod \"authorino-69c6ffbd8c-f8mhh\" (UID: \"613dbe12-2e1c-43b9-86bf-f3064082cd7c\") " pod="kuadrant-system/authorino-69c6ffbd8c-f8mhh"
Apr 22 16:09:19.223313 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:19.223271 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/613dbe12-2e1c-43b9-86bf-f3064082cd7c-tls-cert\") pod \"authorino-69c6ffbd8c-f8mhh\" (UID: \"613dbe12-2e1c-43b9-86bf-f3064082cd7c\") " pod="kuadrant-system/authorino-69c6ffbd8c-f8mhh"
Apr 22 16:09:19.226324 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:19.226292 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/613dbe12-2e1c-43b9-86bf-f3064082cd7c-tls-cert\") pod \"authorino-69c6ffbd8c-f8mhh\" (UID: \"613dbe12-2e1c-43b9-86bf-f3064082cd7c\") " pod="kuadrant-system/authorino-69c6ffbd8c-f8mhh"
Apr 22 16:09:19.231039 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:19.231014 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmdf8\" (UniqueName: \"kubernetes.io/projected/613dbe12-2e1c-43b9-86bf-f3064082cd7c-kube-api-access-fmdf8\") pod \"authorino-69c6ffbd8c-f8mhh\" (UID: \"613dbe12-2e1c-43b9-86bf-f3064082cd7c\") " pod="kuadrant-system/authorino-69c6ffbd8c-f8mhh"
Apr 22 16:09:19.324737 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:19.324694 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-69c6ffbd8c-f8mhh"
Apr 22 16:09:19.452097 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:19.452068 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-69c6ffbd8c-f8mhh"]
Apr 22 16:09:19.454327 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:09:19.454290 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod613dbe12_2e1c_43b9_86bf_f3064082cd7c.slice/crio-88d2484d67ebb0d22612cf6a820800ad51d8cec5d14777cb5c588727a159a101 WatchSource:0}: Error finding container 88d2484d67ebb0d22612cf6a820800ad51d8cec5d14777cb5c588727a159a101: Status 404 returned error can't find the container with id 88d2484d67ebb0d22612cf6a820800ad51d8cec5d14777cb5c588727a159a101
Apr 22 16:09:19.733807 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:19.733771 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-65c8c56bf9-tr7pp" event={"ID":"f2b58d68-c0c0-4389-b76a-5c6187f8f9c7","Type":"ContainerStarted","Data":"76c8edf5870abf0b2c017c78d4416791b6eadec1d0c2f1fca1da9e2d91f599f5"}
Apr 22 16:09:19.734401 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:19.733817 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-65c8c56bf9-tr7pp" event={"ID":"f2b58d68-c0c0-4389-b76a-5c6187f8f9c7","Type":"ContainerStarted","Data":"178402dc6dcaab836f37f38e43cf8b6c181eda85b28faa21154932c5cb25dd5c"}
Apr 22 16:09:19.734401 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:19.733856 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-65c8c56bf9-tr7pp" podUID="f2b58d68-c0c0-4389-b76a-5c6187f8f9c7" containerName="authorino" containerID="cri-o://76c8edf5870abf0b2c017c78d4416791b6eadec1d0c2f1fca1da9e2d91f599f5" gracePeriod=30
Apr 22 16:09:19.734865 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:19.734794 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-69c6ffbd8c-f8mhh" event={"ID":"613dbe12-2e1c-43b9-86bf-f3064082cd7c","Type":"ContainerStarted","Data":"88d2484d67ebb0d22612cf6a820800ad51d8cec5d14777cb5c588727a159a101"}
Apr 22 16:09:19.736022 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:19.735998 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-rlsqk" event={"ID":"f5a00a32-c4e0-49f4-804a-1a4c2b52b670","Type":"ContainerStarted","Data":"66014199dd0e1563d587977244f33a1ea8f323a916b2b61a09d50271526515b6"}
Apr 22 16:09:19.736022 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:19.736022 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-rlsqk" event={"ID":"f5a00a32-c4e0-49f4-804a-1a4c2b52b670","Type":"ContainerStarted","Data":"e99f8e5b1b76175c72ca29f0305b7b889cd41e2f8a0e3d51cf3de3a478ffdd09"}
Apr 22 16:09:19.736186 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:19.736064 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-rlsqk" podUID="f5a00a32-c4e0-49f4-804a-1a4c2b52b670" containerName="authorino" containerID="cri-o://66014199dd0e1563d587977244f33a1ea8f323a916b2b61a09d50271526515b6" gracePeriod=30
Apr 22 16:09:19.747381 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:19.747327 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-65c8c56bf9-tr7pp" podStartSLOduration=1.359736011 podStartE2EDuration="1.747308669s" podCreationTimestamp="2026-04-22 16:09:18 +0000 UTC" firstStartedPulling="2026-04-22 16:09:19.096944903 +0000 UTC m=+639.996498952" lastFinishedPulling="2026-04-22 16:09:19.484517559 +0000 UTC m=+640.384071610" observedRunningTime="2026-04-22 16:09:19.746957733 +0000 UTC m=+640.646511804" watchObservedRunningTime="2026-04-22 16:09:19.747308669 +0000 UTC m=+640.646862741"
Apr 22 16:09:19.759743 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:19.759686 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-rlsqk" podStartSLOduration=1.419336941 podStartE2EDuration="1.759667036s" podCreationTimestamp="2026-04-22 16:09:18 +0000 UTC" firstStartedPulling="2026-04-22 16:09:18.7843479 +0000 UTC m=+639.683901952" lastFinishedPulling="2026-04-22 16:09:19.124677998 +0000 UTC m=+640.024232047" observedRunningTime="2026-04-22 16:09:19.75911581 +0000 UTC m=+640.658669881" watchObservedRunningTime="2026-04-22 16:09:19.759667036 +0000 UTC m=+640.659221135"
Apr 22 16:09:20.031704 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.031681 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-rlsqk"
Apr 22 16:09:20.035296 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.035275 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-65c8c56bf9-tr7pp"
Apr 22 16:09:20.132397 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.132365 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szdns\" (UniqueName: \"kubernetes.io/projected/f5a00a32-c4e0-49f4-804a-1a4c2b52b670-kube-api-access-szdns\") pod \"f5a00a32-c4e0-49f4-804a-1a4c2b52b670\" (UID: \"f5a00a32-c4e0-49f4-804a-1a4c2b52b670\") "
Apr 22 16:09:20.132397 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.132412 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnkng\" (UniqueName: \"kubernetes.io/projected/f2b58d68-c0c0-4389-b76a-5c6187f8f9c7-kube-api-access-lnkng\") pod \"f2b58d68-c0c0-4389-b76a-5c6187f8f9c7\" (UID: \"f2b58d68-c0c0-4389-b76a-5c6187f8f9c7\") "
Apr 22 16:09:20.134611 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.134578 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5a00a32-c4e0-49f4-804a-1a4c2b52b670-kube-api-access-szdns" (OuterVolumeSpecName: "kube-api-access-szdns") pod "f5a00a32-c4e0-49f4-804a-1a4c2b52b670" (UID: "f5a00a32-c4e0-49f4-804a-1a4c2b52b670"). InnerVolumeSpecName "kube-api-access-szdns". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:09:20.134724 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.134623 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2b58d68-c0c0-4389-b76a-5c6187f8f9c7-kube-api-access-lnkng" (OuterVolumeSpecName: "kube-api-access-lnkng") pod "f2b58d68-c0c0-4389-b76a-5c6187f8f9c7" (UID: "f2b58d68-c0c0-4389-b76a-5c6187f8f9c7"). InnerVolumeSpecName "kube-api-access-lnkng". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:09:20.233495 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.233455 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-szdns\" (UniqueName: \"kubernetes.io/projected/f5a00a32-c4e0-49f4-804a-1a4c2b52b670-kube-api-access-szdns\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\""
Apr 22 16:09:20.233495 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.233487 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lnkng\" (UniqueName: \"kubernetes.io/projected/f2b58d68-c0c0-4389-b76a-5c6187f8f9c7-kube-api-access-lnkng\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\""
Apr 22 16:09:20.741840 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.741807 2572 generic.go:358] "Generic (PLEG): container finished" podID="f2b58d68-c0c0-4389-b76a-5c6187f8f9c7" containerID="76c8edf5870abf0b2c017c78d4416791b6eadec1d0c2f1fca1da9e2d91f599f5" exitCode=0
Apr 22 16:09:20.742310 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.741865 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-65c8c56bf9-tr7pp"
Apr 22 16:09:20.742310 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.741889 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-65c8c56bf9-tr7pp" event={"ID":"f2b58d68-c0c0-4389-b76a-5c6187f8f9c7","Type":"ContainerDied","Data":"76c8edf5870abf0b2c017c78d4416791b6eadec1d0c2f1fca1da9e2d91f599f5"}
Apr 22 16:09:20.742310 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.741920 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-65c8c56bf9-tr7pp" event={"ID":"f2b58d68-c0c0-4389-b76a-5c6187f8f9c7","Type":"ContainerDied","Data":"178402dc6dcaab836f37f38e43cf8b6c181eda85b28faa21154932c5cb25dd5c"}
Apr 22 16:09:20.742310 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.741936 2572 scope.go:117] "RemoveContainer" containerID="76c8edf5870abf0b2c017c78d4416791b6eadec1d0c2f1fca1da9e2d91f599f5"
Apr 22 16:09:20.743471 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.743451 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-69c6ffbd8c-f8mhh" event={"ID":"613dbe12-2e1c-43b9-86bf-f3064082cd7c","Type":"ContainerStarted","Data":"e7319ea5589559f96779d0dd332f04be52f8c3be579eeec590b0b391a757231a"}
Apr 22 16:09:20.744528 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.744508 2572 generic.go:358] "Generic (PLEG): container finished" podID="f5a00a32-c4e0-49f4-804a-1a4c2b52b670" containerID="66014199dd0e1563d587977244f33a1ea8f323a916b2b61a09d50271526515b6" exitCode=0
Apr 22 16:09:20.744628 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.744568 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-rlsqk" event={"ID":"f5a00a32-c4e0-49f4-804a-1a4c2b52b670","Type":"ContainerDied","Data":"66014199dd0e1563d587977244f33a1ea8f323a916b2b61a09d50271526515b6"}
Apr 22 16:09:20.744628 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.744590 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-rlsqk" event={"ID":"f5a00a32-c4e0-49f4-804a-1a4c2b52b670","Type":"ContainerDied","Data":"e99f8e5b1b76175c72ca29f0305b7b889cd41e2f8a0e3d51cf3de3a478ffdd09"}
Apr 22 16:09:20.744628 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.744615 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-rlsqk"
Apr 22 16:09:20.751785 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.751767 2572 scope.go:117] "RemoveContainer" containerID="76c8edf5870abf0b2c017c78d4416791b6eadec1d0c2f1fca1da9e2d91f599f5"
Apr 22 16:09:20.752053 ip-10-0-135-9 kubenswrapper[2572]: E0422 16:09:20.752027 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76c8edf5870abf0b2c017c78d4416791b6eadec1d0c2f1fca1da9e2d91f599f5\": container with ID starting with 76c8edf5870abf0b2c017c78d4416791b6eadec1d0c2f1fca1da9e2d91f599f5 not found: ID does not exist" containerID="76c8edf5870abf0b2c017c78d4416791b6eadec1d0c2f1fca1da9e2d91f599f5"
Apr 22 16:09:20.752053 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.752058 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76c8edf5870abf0b2c017c78d4416791b6eadec1d0c2f1fca1da9e2d91f599f5"} err="failed to get container status \"76c8edf5870abf0b2c017c78d4416791b6eadec1d0c2f1fca1da9e2d91f599f5\": rpc error: code = NotFound desc = could not find container \"76c8edf5870abf0b2c017c78d4416791b6eadec1d0c2f1fca1da9e2d91f599f5\": container with ID starting with 76c8edf5870abf0b2c017c78d4416791b6eadec1d0c2f1fca1da9e2d91f599f5 not found: ID does not exist"
Apr 22 16:09:20.752254 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.752076 2572 scope.go:117] "RemoveContainer" containerID="66014199dd0e1563d587977244f33a1ea8f323a916b2b61a09d50271526515b6"
Apr 22 16:09:20.760887 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.760781 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-69c6ffbd8c-f8mhh" podStartSLOduration=2.181180943 podStartE2EDuration="2.760765219s" podCreationTimestamp="2026-04-22 16:09:18 +0000 UTC" firstStartedPulling="2026-04-22 16:09:19.455676092 +0000 UTC m=+640.355230141" lastFinishedPulling="2026-04-22 16:09:20.035260354 +0000 UTC m=+640.934814417" observedRunningTime="2026-04-22 16:09:20.758827856 +0000 UTC m=+641.658381927" watchObservedRunningTime="2026-04-22 16:09:20.760765219 +0000 UTC m=+641.660319292"
Apr 22 16:09:20.760977 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.760927 2572 scope.go:117] "RemoveContainer" containerID="66014199dd0e1563d587977244f33a1ea8f323a916b2b61a09d50271526515b6"
Apr 22 16:09:20.761258 ip-10-0-135-9 kubenswrapper[2572]: E0422 16:09:20.761234 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66014199dd0e1563d587977244f33a1ea8f323a916b2b61a09d50271526515b6\": container with ID starting with 66014199dd0e1563d587977244f33a1ea8f323a916b2b61a09d50271526515b6 not found: ID does not exist" containerID="66014199dd0e1563d587977244f33a1ea8f323a916b2b61a09d50271526515b6"
Apr 22 16:09:20.761335 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.761265 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66014199dd0e1563d587977244f33a1ea8f323a916b2b61a09d50271526515b6"} err="failed to get container status \"66014199dd0e1563d587977244f33a1ea8f323a916b2b61a09d50271526515b6\": rpc error: code = NotFound desc = could not find container \"66014199dd0e1563d587977244f33a1ea8f323a916b2b61a09d50271526515b6\": container with ID starting with 66014199dd0e1563d587977244f33a1ea8f323a916b2b61a09d50271526515b6 not found: ID does not exist"
Apr 22 16:09:20.776697 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.776664 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-rlsqk"]
Apr 22 16:09:20.780665 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.780635 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-rlsqk"]
Apr 22 16:09:20.785055 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.785029 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-qngws"]
Apr 22 16:09:20.785275 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.785254 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-qngws" podUID="e204f648-e4d5-4969-aa37-2acf94284698" containerName="authorino" containerID="cri-o://7c4ca0d26336f8b0bfb8afa67259e976d8cd6e4347e80637d6b3fdb6758316c6" gracePeriod=30
Apr 22 16:09:20.798554 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.798521 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-65c8c56bf9-tr7pp"]
Apr 22 16:09:20.804102 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:20.804066 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-65c8c56bf9-tr7pp"]
Apr 22 16:09:21.041053 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:21.041024 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-qngws"
Apr 22 16:09:21.240459 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:21.240369 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6bnz\" (UniqueName: \"kubernetes.io/projected/e204f648-e4d5-4969-aa37-2acf94284698-kube-api-access-f6bnz\") pod \"e204f648-e4d5-4969-aa37-2acf94284698\" (UID: \"e204f648-e4d5-4969-aa37-2acf94284698\") "
Apr 22 16:09:21.242604 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:21.242577 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e204f648-e4d5-4969-aa37-2acf94284698-kube-api-access-f6bnz" (OuterVolumeSpecName: "kube-api-access-f6bnz") pod "e204f648-e4d5-4969-aa37-2acf94284698" (UID: "e204f648-e4d5-4969-aa37-2acf94284698"). InnerVolumeSpecName "kube-api-access-f6bnz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:09:21.341670 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:21.341628 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f6bnz\" (UniqueName: \"kubernetes.io/projected/e204f648-e4d5-4969-aa37-2acf94284698-kube-api-access-f6bnz\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\""
Apr 22 16:09:21.644920 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:21.644884 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2b58d68-c0c0-4389-b76a-5c6187f8f9c7" path="/var/lib/kubelet/pods/f2b58d68-c0c0-4389-b76a-5c6187f8f9c7/volumes"
Apr 22 16:09:21.645186 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:21.645174 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5a00a32-c4e0-49f4-804a-1a4c2b52b670" path="/var/lib/kubelet/pods/f5a00a32-c4e0-49f4-804a-1a4c2b52b670/volumes"
Apr 22 16:09:21.750061 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:21.750024 2572 generic.go:358] "Generic (PLEG): container finished" podID="e204f648-e4d5-4969-aa37-2acf94284698" containerID="7c4ca0d26336f8b0bfb8afa67259e976d8cd6e4347e80637d6b3fdb6758316c6" exitCode=0
Apr 22 16:09:21.750518 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:21.750073 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-qngws"
Apr 22 16:09:21.750518 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:21.750095 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-qngws" event={"ID":"e204f648-e4d5-4969-aa37-2acf94284698","Type":"ContainerDied","Data":"7c4ca0d26336f8b0bfb8afa67259e976d8cd6e4347e80637d6b3fdb6758316c6"}
Apr 22 16:09:21.750518 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:21.750128 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-qngws" event={"ID":"e204f648-e4d5-4969-aa37-2acf94284698","Type":"ContainerDied","Data":"d78d0463c7feccd26e632e4b7cb26a23117336c60d680d86ea6e7d81c0da924c"}
Apr 22 16:09:21.750518 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:21.750146 2572 scope.go:117] "RemoveContainer" containerID="7c4ca0d26336f8b0bfb8afa67259e976d8cd6e4347e80637d6b3fdb6758316c6"
Apr 22 16:09:21.758417 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:21.758398 2572 scope.go:117] "RemoveContainer" containerID="7c4ca0d26336f8b0bfb8afa67259e976d8cd6e4347e80637d6b3fdb6758316c6"
Apr 22 16:09:21.758698 ip-10-0-135-9 kubenswrapper[2572]: E0422 16:09:21.758680 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c4ca0d26336f8b0bfb8afa67259e976d8cd6e4347e80637d6b3fdb6758316c6\": container with ID starting with 7c4ca0d26336f8b0bfb8afa67259e976d8cd6e4347e80637d6b3fdb6758316c6 not found: ID does not exist" containerID="7c4ca0d26336f8b0bfb8afa67259e976d8cd6e4347e80637d6b3fdb6758316c6"
Apr 22 16:09:21.758743 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:21.758708 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c4ca0d26336f8b0bfb8afa67259e976d8cd6e4347e80637d6b3fdb6758316c6"} err="failed to get container status \"7c4ca0d26336f8b0bfb8afa67259e976d8cd6e4347e80637d6b3fdb6758316c6\": rpc error: code = NotFound desc = could not find container \"7c4ca0d26336f8b0bfb8afa67259e976d8cd6e4347e80637d6b3fdb6758316c6\": container with ID starting with 7c4ca0d26336f8b0bfb8afa67259e976d8cd6e4347e80637d6b3fdb6758316c6 not found: ID does not exist"
Apr 22 16:09:21.770298 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:21.770270 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-qngws"]
Apr 22 16:09:21.774358 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:21.774333 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-qngws"]
Apr 22 16:09:23.645613 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:09:23.645579 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e204f648-e4d5-4969-aa37-2acf94284698" path="/var/lib/kubelet/pods/e204f648-e4d5-4969-aa37-2acf94284698/volumes"
Apr 22 16:10:07.262053 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.262017 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k"]
Apr 22 16:10:07.262873 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.262389 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2b58d68-c0c0-4389-b76a-5c6187f8f9c7" containerName="authorino"
Apr 22 16:10:07.262873 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.262407 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2b58d68-c0c0-4389-b76a-5c6187f8f9c7" containerName="authorino"
Apr 22 16:10:07.262873 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.262425 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e204f648-e4d5-4969-aa37-2acf94284698" containerName="authorino"
Apr 22 16:10:07.262873 ip-10-0-135-9 
kubenswrapper[2572]: I0422 16:10:07.262432 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e204f648-e4d5-4969-aa37-2acf94284698" containerName="authorino" Apr 22 16:10:07.262873 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.262450 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5a00a32-c4e0-49f4-804a-1a4c2b52b670" containerName="authorino" Apr 22 16:10:07.262873 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.262458 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5a00a32-c4e0-49f4-804a-1a4c2b52b670" containerName="authorino" Apr 22 16:10:07.262873 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.262561 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f5a00a32-c4e0-49f4-804a-1a4c2b52b670" containerName="authorino" Apr 22 16:10:07.262873 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.262576 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f2b58d68-c0c0-4389-b76a-5c6187f8f9c7" containerName="authorino" Apr 22 16:10:07.262873 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.262587 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e204f648-e4d5-4969-aa37-2acf94284698" containerName="authorino" Apr 22 16:10:07.265574 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.265558 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" Apr 22 16:10:07.268737 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.268713 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 22 16:10:07.268868 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.268713 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 22 16:10:07.268868 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.268749 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 22 16:10:07.268868 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.268713 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-k94wr\"" Apr 22 16:10:07.274934 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.274914 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k"] Apr 22 16:10:07.407502 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.407464 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k\" (UID: \"fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" Apr 22 16:10:07.407692 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.407566 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k\" (UID: \"fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" Apr 22 
16:10:07.407692 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.407602 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k\" (UID: \"fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" Apr 22 16:10:07.407692 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.407649 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k\" (UID: \"fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" Apr 22 16:10:07.407692 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.407674 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8kwm\" (UniqueName: \"kubernetes.io/projected/fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15-kube-api-access-j8kwm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k\" (UID: \"fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" Apr 22 16:10:07.407852 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.407706 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k\" (UID: \"fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" Apr 22 16:10:07.508622 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.508585 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k\" (UID: \"fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" Apr 22 16:10:07.508622 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.508634 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k\" (UID: \"fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" Apr 22 16:10:07.508877 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.508690 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k\" (UID: \"fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" Apr 22 16:10:07.508877 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.508715 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k\" (UID: \"fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" Apr 22 16:10:07.508877 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.508734 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k\" (UID: \"fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" Apr 22 16:10:07.508877 ip-10-0-135-9 
kubenswrapper[2572]: I0422 16:10:07.508841 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8kwm\" (UniqueName: \"kubernetes.io/projected/fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15-kube-api-access-j8kwm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k\" (UID: \"fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" Apr 22 16:10:07.509133 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.509108 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k\" (UID: \"fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" Apr 22 16:10:07.509271 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.509150 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k\" (UID: \"fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" Apr 22 16:10:07.509271 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.509175 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k\" (UID: \"fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" Apr 22 16:10:07.511136 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.511112 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k\" (UID: 
\"fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" Apr 22 16:10:07.511379 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.511364 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k\" (UID: \"fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" Apr 22 16:10:07.517609 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.517548 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8kwm\" (UniqueName: \"kubernetes.io/projected/fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15-kube-api-access-j8kwm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k\" (UID: \"fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" Apr 22 16:10:07.577383 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.577347 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" Apr 22 16:10:07.706432 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.706406 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k"] Apr 22 16:10:07.708575 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:10:07.708546 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd7a9a3e_dc94_48f3_b9d2_a9b53d2f8a15.slice/crio-ec627a2596b2e260340fb8fcc8ca2270c6cd665fe64756ecb764f33711cd3688 WatchSource:0}: Error finding container ec627a2596b2e260340fb8fcc8ca2270c6cd665fe64756ecb764f33711cd3688: Status 404 returned error can't find the container with id ec627a2596b2e260340fb8fcc8ca2270c6cd665fe64756ecb764f33711cd3688 Apr 22 16:10:07.710353 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.710330 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 16:10:07.917242 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:07.917185 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" event={"ID":"fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15","Type":"ContainerStarted","Data":"ec627a2596b2e260340fb8fcc8ca2270c6cd665fe64756ecb764f33711cd3688"} Apr 22 16:10:15.951937 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:15.951898 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" event={"ID":"fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15","Type":"ContainerStarted","Data":"52e18e2e54dc457067daf2803adbf85851ab7afa730339d962a3c53246894589"} Apr 22 16:10:20.969867 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:20.969786 2572 generic.go:358] "Generic (PLEG): container finished" podID="fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15" containerID="52e18e2e54dc457067daf2803adbf85851ab7afa730339d962a3c53246894589" exitCode=0 Apr 22 16:10:20.970253 
ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:20.969863 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" event={"ID":"fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15","Type":"ContainerDied","Data":"52e18e2e54dc457067daf2803adbf85851ab7afa730339d962a3c53246894589"} Apr 22 16:10:22.978986 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:22.978951 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" event={"ID":"fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15","Type":"ContainerStarted","Data":"7eef6087b4ae82f37a97c1be1f00abc6986acbf7a53bb909849bbd1f6293d6ab"} Apr 22 16:10:22.979396 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:22.979166 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" Apr 22 16:10:22.997411 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:22.997358 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" podStartSLOduration=1.644178461 podStartE2EDuration="15.997344259s" podCreationTimestamp="2026-04-22 16:10:07 +0000 UTC" firstStartedPulling="2026-04-22 16:10:07.710467069 +0000 UTC m=+688.610021119" lastFinishedPulling="2026-04-22 16:10:22.063632864 +0000 UTC m=+702.963186917" observedRunningTime="2026-04-22 16:10:22.994547755 +0000 UTC m=+703.894101822" watchObservedRunningTime="2026-04-22 16:10:22.997344259 +0000 UTC m=+703.896898338" Apr 22 16:10:33.995913 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:33.995839 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k" Apr 22 16:10:47.454874 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:47.454845 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v"] Apr 22 16:10:47.478170 ip-10-0-135-9 kubenswrapper[2572]: I0422 
16:10:47.478142 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v"] Apr 22 16:10:47.478341 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:47.478291 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" Apr 22 16:10:47.480655 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:47.480633 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 22 16:10:47.559961 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:47.559928 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fce68264-52f2-4ac1-8162-43d838d95a89-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-s5v2v\" (UID: \"fce68264-52f2-4ac1-8162-43d838d95a89\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" Apr 22 16:10:47.559961 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:47.559964 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fce68264-52f2-4ac1-8162-43d838d95a89-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-s5v2v\" (UID: \"fce68264-52f2-4ac1-8162-43d838d95a89\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" Apr 22 16:10:47.560161 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:47.560011 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fce68264-52f2-4ac1-8162-43d838d95a89-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-s5v2v\" (UID: \"fce68264-52f2-4ac1-8162-43d838d95a89\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" Apr 22 16:10:47.560161 ip-10-0-135-9 kubenswrapper[2572]: I0422 
16:10:47.560056 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nk6l\" (UniqueName: \"kubernetes.io/projected/fce68264-52f2-4ac1-8162-43d838d95a89-kube-api-access-5nk6l\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-s5v2v\" (UID: \"fce68264-52f2-4ac1-8162-43d838d95a89\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" Apr 22 16:10:47.560161 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:47.560090 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fce68264-52f2-4ac1-8162-43d838d95a89-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-s5v2v\" (UID: \"fce68264-52f2-4ac1-8162-43d838d95a89\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" Apr 22 16:10:47.560161 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:47.560137 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fce68264-52f2-4ac1-8162-43d838d95a89-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-s5v2v\" (UID: \"fce68264-52f2-4ac1-8162-43d838d95a89\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" Apr 22 16:10:47.661644 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:47.661603 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fce68264-52f2-4ac1-8162-43d838d95a89-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-s5v2v\" (UID: \"fce68264-52f2-4ac1-8162-43d838d95a89\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" Apr 22 16:10:47.661841 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:47.661671 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nk6l\" (UniqueName: 
\"kubernetes.io/projected/fce68264-52f2-4ac1-8162-43d838d95a89-kube-api-access-5nk6l\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-s5v2v\" (UID: \"fce68264-52f2-4ac1-8162-43d838d95a89\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" Apr 22 16:10:47.661841 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:47.661711 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fce68264-52f2-4ac1-8162-43d838d95a89-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-s5v2v\" (UID: \"fce68264-52f2-4ac1-8162-43d838d95a89\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" Apr 22 16:10:47.661841 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:47.661790 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fce68264-52f2-4ac1-8162-43d838d95a89-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-s5v2v\" (UID: \"fce68264-52f2-4ac1-8162-43d838d95a89\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" Apr 22 16:10:47.661999 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:47.661858 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fce68264-52f2-4ac1-8162-43d838d95a89-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-s5v2v\" (UID: \"fce68264-52f2-4ac1-8162-43d838d95a89\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" Apr 22 16:10:47.661999 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:47.661892 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fce68264-52f2-4ac1-8162-43d838d95a89-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-s5v2v\" (UID: \"fce68264-52f2-4ac1-8162-43d838d95a89\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" Apr 22 16:10:47.663225 ip-10-0-135-9 
kubenswrapper[2572]: I0422 16:10:47.662792 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fce68264-52f2-4ac1-8162-43d838d95a89-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-s5v2v\" (UID: \"fce68264-52f2-4ac1-8162-43d838d95a89\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" Apr 22 16:10:47.663225 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:47.662880 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fce68264-52f2-4ac1-8162-43d838d95a89-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-s5v2v\" (UID: \"fce68264-52f2-4ac1-8162-43d838d95a89\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" Apr 22 16:10:47.663418 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:47.663253 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fce68264-52f2-4ac1-8162-43d838d95a89-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-s5v2v\" (UID: \"fce68264-52f2-4ac1-8162-43d838d95a89\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" Apr 22 16:10:47.669172 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:47.666856 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fce68264-52f2-4ac1-8162-43d838d95a89-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-s5v2v\" (UID: \"fce68264-52f2-4ac1-8162-43d838d95a89\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" Apr 22 16:10:47.671286 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:47.671257 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fce68264-52f2-4ac1-8162-43d838d95a89-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-s5v2v\" (UID: 
\"fce68264-52f2-4ac1-8162-43d838d95a89\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" Apr 22 16:10:47.671835 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:47.671802 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nk6l\" (UniqueName: \"kubernetes.io/projected/fce68264-52f2-4ac1-8162-43d838d95a89-kube-api-access-5nk6l\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-s5v2v\" (UID: \"fce68264-52f2-4ac1-8162-43d838d95a89\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" Apr 22 16:10:47.788671 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:47.788587 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" Apr 22 16:10:47.914171 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:47.914100 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v"] Apr 22 16:10:47.916500 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:10:47.916468 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfce68264_52f2_4ac1_8162_43d838d95a89.slice/crio-b8961d7e2f1ee71250496f743ef8cc5da7759ac6a0677998d3b1d757f4a41b27 WatchSource:0}: Error finding container b8961d7e2f1ee71250496f743ef8cc5da7759ac6a0677998d3b1d757f4a41b27: Status 404 returned error can't find the container with id b8961d7e2f1ee71250496f743ef8cc5da7759ac6a0677998d3b1d757f4a41b27 Apr 22 16:10:48.069559 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:48.069454 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" event={"ID":"fce68264-52f2-4ac1-8162-43d838d95a89","Type":"ContainerStarted","Data":"cc0b8652d9539c0196c9f1060b68249b85a9d395f7bac82843d4e46a81dcb61c"} Apr 22 16:10:48.069559 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:48.069500 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" event={"ID":"fce68264-52f2-4ac1-8162-43d838d95a89","Type":"ContainerStarted","Data":"b8961d7e2f1ee71250496f743ef8cc5da7759ac6a0677998d3b1d757f4a41b27"} Apr 22 16:10:54.093394 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:54.093359 2572 generic.go:358] "Generic (PLEG): container finished" podID="fce68264-52f2-4ac1-8162-43d838d95a89" containerID="cc0b8652d9539c0196c9f1060b68249b85a9d395f7bac82843d4e46a81dcb61c" exitCode=0 Apr 22 16:10:54.093765 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:54.093432 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" event={"ID":"fce68264-52f2-4ac1-8162-43d838d95a89","Type":"ContainerDied","Data":"cc0b8652d9539c0196c9f1060b68249b85a9d395f7bac82843d4e46a81dcb61c"} Apr 22 16:10:55.099041 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:55.099007 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" event={"ID":"fce68264-52f2-4ac1-8162-43d838d95a89","Type":"ContainerStarted","Data":"72566e8b219be8f61c7ebe613190efd8f9f601fbc9202051f43053689c1d7202"} Apr 22 16:10:55.099527 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:55.099346 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" Apr 22 16:10:55.116848 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:10:55.116803 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" podStartSLOduration=7.570338363 podStartE2EDuration="8.116792312s" podCreationTimestamp="2026-04-22 16:10:47 +0000 UTC" firstStartedPulling="2026-04-22 16:10:54.094022561 +0000 UTC m=+734.993576610" lastFinishedPulling="2026-04-22 16:10:54.640476506 +0000 UTC m=+735.540030559" observedRunningTime="2026-04-22 16:10:55.115551183 +0000 UTC m=+736.015105254" watchObservedRunningTime="2026-04-22 
16:10:55.116792312 +0000 UTC m=+736.016346382" Apr 22 16:11:06.116255 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:06.116223 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-s5v2v" Apr 22 16:11:19.756278 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:19.756234 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-569bb88bc4-h82nk"] Apr 22 16:11:19.758714 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:19.758698 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-569bb88bc4-h82nk" Apr 22 16:11:19.766428 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:19.766026 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-569bb88bc4-h82nk"] Apr 22 16:11:19.938263 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:19.938228 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwvrr\" (UniqueName: \"kubernetes.io/projected/a309407a-446a-4c50-8e74-cbcba0c5c773-kube-api-access-bwvrr\") pod \"authorino-569bb88bc4-h82nk\" (UID: \"a309407a-446a-4c50-8e74-cbcba0c5c773\") " pod="kuadrant-system/authorino-569bb88bc4-h82nk" Apr 22 16:11:19.938441 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:19.938273 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a309407a-446a-4c50-8e74-cbcba0c5c773-tls-cert\") pod \"authorino-569bb88bc4-h82nk\" (UID: \"a309407a-446a-4c50-8e74-cbcba0c5c773\") " pod="kuadrant-system/authorino-569bb88bc4-h82nk" Apr 22 16:11:20.039039 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:20.038941 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bwvrr\" (UniqueName: \"kubernetes.io/projected/a309407a-446a-4c50-8e74-cbcba0c5c773-kube-api-access-bwvrr\") pod 
\"authorino-569bb88bc4-h82nk\" (UID: \"a309407a-446a-4c50-8e74-cbcba0c5c773\") " pod="kuadrant-system/authorino-569bb88bc4-h82nk" Apr 22 16:11:20.039039 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:20.038992 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a309407a-446a-4c50-8e74-cbcba0c5c773-tls-cert\") pod \"authorino-569bb88bc4-h82nk\" (UID: \"a309407a-446a-4c50-8e74-cbcba0c5c773\") " pod="kuadrant-system/authorino-569bb88bc4-h82nk" Apr 22 16:11:20.041640 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:20.041613 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/a309407a-446a-4c50-8e74-cbcba0c5c773-tls-cert\") pod \"authorino-569bb88bc4-h82nk\" (UID: \"a309407a-446a-4c50-8e74-cbcba0c5c773\") " pod="kuadrant-system/authorino-569bb88bc4-h82nk" Apr 22 16:11:20.046644 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:20.046622 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwvrr\" (UniqueName: \"kubernetes.io/projected/a309407a-446a-4c50-8e74-cbcba0c5c773-kube-api-access-bwvrr\") pod \"authorino-569bb88bc4-h82nk\" (UID: \"a309407a-446a-4c50-8e74-cbcba0c5c773\") " pod="kuadrant-system/authorino-569bb88bc4-h82nk" Apr 22 16:11:20.069574 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:20.069541 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-569bb88bc4-h82nk" Apr 22 16:11:20.198345 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:20.198314 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-569bb88bc4-h82nk"] Apr 22 16:11:20.200774 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:11:20.200745 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda309407a_446a_4c50_8e74_cbcba0c5c773.slice/crio-3a6958f4265ff4947ed119ab1b1817bc0ec9e7f8f26006291a0ce8e0ea619d50 WatchSource:0}: Error finding container 3a6958f4265ff4947ed119ab1b1817bc0ec9e7f8f26006291a0ce8e0ea619d50: Status 404 returned error can't find the container with id 3a6958f4265ff4947ed119ab1b1817bc0ec9e7f8f26006291a0ce8e0ea619d50 Apr 22 16:11:21.201059 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:21.201021 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-569bb88bc4-h82nk" event={"ID":"a309407a-446a-4c50-8e74-cbcba0c5c773","Type":"ContainerStarted","Data":"2922d1fe03a032d9d7598f46efa11770e96ae909cd89b80628fb8faa59d7abbd"} Apr 22 16:11:21.201059 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:21.201065 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-569bb88bc4-h82nk" event={"ID":"a309407a-446a-4c50-8e74-cbcba0c5c773","Type":"ContainerStarted","Data":"3a6958f4265ff4947ed119ab1b1817bc0ec9e7f8f26006291a0ce8e0ea619d50"} Apr 22 16:11:21.229318 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:21.229273 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-569bb88bc4-h82nk" podStartSLOduration=1.753081149 podStartE2EDuration="2.22925967s" podCreationTimestamp="2026-04-22 16:11:19 +0000 UTC" firstStartedPulling="2026-04-22 16:11:20.202284985 +0000 UTC m=+761.101839034" lastFinishedPulling="2026-04-22 16:11:20.678463503 +0000 UTC m=+761.578017555" observedRunningTime="2026-04-22 
16:11:21.22678178 +0000 UTC m=+762.126335880" watchObservedRunningTime="2026-04-22 16:11:21.22925967 +0000 UTC m=+762.128813740" Apr 22 16:11:21.296309 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:21.296266 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-69c6ffbd8c-f8mhh"] Apr 22 16:11:21.296544 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:21.296520 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-69c6ffbd8c-f8mhh" podUID="613dbe12-2e1c-43b9-86bf-f3064082cd7c" containerName="authorino" containerID="cri-o://e7319ea5589559f96779d0dd332f04be52f8c3be579eeec590b0b391a757231a" gracePeriod=30 Apr 22 16:11:21.542056 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:21.542016 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-69c6ffbd8c-f8mhh" Apr 22 16:11:21.552099 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:21.552074 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmdf8\" (UniqueName: \"kubernetes.io/projected/613dbe12-2e1c-43b9-86bf-f3064082cd7c-kube-api-access-fmdf8\") pod \"613dbe12-2e1c-43b9-86bf-f3064082cd7c\" (UID: \"613dbe12-2e1c-43b9-86bf-f3064082cd7c\") " Apr 22 16:11:21.552302 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:21.552105 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/613dbe12-2e1c-43b9-86bf-f3064082cd7c-tls-cert\") pod \"613dbe12-2e1c-43b9-86bf-f3064082cd7c\" (UID: \"613dbe12-2e1c-43b9-86bf-f3064082cd7c\") " Apr 22 16:11:21.554986 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:21.554952 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/613dbe12-2e1c-43b9-86bf-f3064082cd7c-kube-api-access-fmdf8" (OuterVolumeSpecName: "kube-api-access-fmdf8") pod "613dbe12-2e1c-43b9-86bf-f3064082cd7c" (UID: 
"613dbe12-2e1c-43b9-86bf-f3064082cd7c"). InnerVolumeSpecName "kube-api-access-fmdf8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:11:21.566117 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:21.566082 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/613dbe12-2e1c-43b9-86bf-f3064082cd7c-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "613dbe12-2e1c-43b9-86bf-f3064082cd7c" (UID: "613dbe12-2e1c-43b9-86bf-f3064082cd7c"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:11:21.652636 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:21.652610 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fmdf8\" (UniqueName: \"kubernetes.io/projected/613dbe12-2e1c-43b9-86bf-f3064082cd7c-kube-api-access-fmdf8\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:11:21.652636 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:21.652636 2572 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/613dbe12-2e1c-43b9-86bf-f3064082cd7c-tls-cert\") on node \"ip-10-0-135-9.ec2.internal\" DevicePath \"\"" Apr 22 16:11:22.206149 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:22.206113 2572 generic.go:358] "Generic (PLEG): container finished" podID="613dbe12-2e1c-43b9-86bf-f3064082cd7c" containerID="e7319ea5589559f96779d0dd332f04be52f8c3be579eeec590b0b391a757231a" exitCode=0 Apr 22 16:11:22.206611 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:22.206164 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-69c6ffbd8c-f8mhh" Apr 22 16:11:22.206611 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:22.206225 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-69c6ffbd8c-f8mhh" event={"ID":"613dbe12-2e1c-43b9-86bf-f3064082cd7c","Type":"ContainerDied","Data":"e7319ea5589559f96779d0dd332f04be52f8c3be579eeec590b0b391a757231a"} Apr 22 16:11:22.206611 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:22.206265 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-69c6ffbd8c-f8mhh" event={"ID":"613dbe12-2e1c-43b9-86bf-f3064082cd7c","Type":"ContainerDied","Data":"88d2484d67ebb0d22612cf6a820800ad51d8cec5d14777cb5c588727a159a101"} Apr 22 16:11:22.206611 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:22.206280 2572 scope.go:117] "RemoveContainer" containerID="e7319ea5589559f96779d0dd332f04be52f8c3be579eeec590b0b391a757231a" Apr 22 16:11:22.215063 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:22.215044 2572 scope.go:117] "RemoveContainer" containerID="e7319ea5589559f96779d0dd332f04be52f8c3be579eeec590b0b391a757231a" Apr 22 16:11:22.215415 ip-10-0-135-9 kubenswrapper[2572]: E0422 16:11:22.215399 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7319ea5589559f96779d0dd332f04be52f8c3be579eeec590b0b391a757231a\": container with ID starting with e7319ea5589559f96779d0dd332f04be52f8c3be579eeec590b0b391a757231a not found: ID does not exist" containerID="e7319ea5589559f96779d0dd332f04be52f8c3be579eeec590b0b391a757231a" Apr 22 16:11:22.215474 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:22.215423 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7319ea5589559f96779d0dd332f04be52f8c3be579eeec590b0b391a757231a"} err="failed to get container status \"e7319ea5589559f96779d0dd332f04be52f8c3be579eeec590b0b391a757231a\": rpc error: code = NotFound desc 
= could not find container \"e7319ea5589559f96779d0dd332f04be52f8c3be579eeec590b0b391a757231a\": container with ID starting with e7319ea5589559f96779d0dd332f04be52f8c3be579eeec590b0b391a757231a not found: ID does not exist" Apr 22 16:11:22.220790 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:22.220766 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-69c6ffbd8c-f8mhh"] Apr 22 16:11:22.224350 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:22.224326 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-69c6ffbd8c-f8mhh"] Apr 22 16:11:23.645319 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:11:23.645281 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="613dbe12-2e1c-43b9-86bf-f3064082cd7c" path="/var/lib/kubelet/pods/613dbe12-2e1c-43b9-86bf-f3064082cd7c/volumes" Apr 22 16:13:21.921149 ip-10-0-135-9 kubenswrapper[2572]: E0422 16:13:21.921117 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/systemd-tmpfiles-clean.service\": RecentStats: unable to find data in memory cache]" Apr 22 16:13:31.965141 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:31.965056 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-569bb88bc4-h82nk_a309407a-446a-4c50-8e74-cbcba0c5c773/authorino/0.log" Apr 22 16:13:36.707156 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:36.707124 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-54dfb4598d-7pws5_9d0807db-368f-4d23-a54f-aba01a637eef/manager/0.log" Apr 22 16:13:37.666697 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:37.666668 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62_38c23430-348a-4872-9a3a-c52871fc1766/util/0.log" Apr 22 16:13:37.673400 ip-10-0-135-9 kubenswrapper[2572]: I0422 
16:13:37.673376 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62_38c23430-348a-4872-9a3a-c52871fc1766/pull/0.log" Apr 22 16:13:37.679566 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:37.679541 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62_38c23430-348a-4872-9a3a-c52871fc1766/extract/0.log" Apr 22 16:13:37.792133 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:37.792109 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j_957b9816-c3b2-4085-9c84-b215810dd6ca/util/0.log" Apr 22 16:13:37.798148 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:37.798112 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j_957b9816-c3b2-4085-9c84-b215810dd6ca/pull/0.log" Apr 22 16:13:37.803753 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:37.803731 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j_957b9816-c3b2-4085-9c84-b215810dd6ca/extract/0.log" Apr 22 16:13:37.908667 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:37.908641 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9_53b8f533-ce84-45f1-913e-fa4b16c3cea9/util/0.log" Apr 22 16:13:37.914465 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:37.914445 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9_53b8f533-ce84-45f1-913e-fa4b16c3cea9/pull/0.log" Apr 22 16:13:37.920310 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:37.920241 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9_53b8f533-ce84-45f1-913e-fa4b16c3cea9/extract/0.log" Apr 22 16:13:38.027825 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:38.027794 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l_f54c5736-e675-4465-a724-f1b413683899/pull/0.log" Apr 22 16:13:38.033640 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:38.033617 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l_f54c5736-e675-4465-a724-f1b413683899/extract/0.log" Apr 22 16:13:38.038942 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:38.038923 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l_f54c5736-e675-4465-a724-f1b413683899/util/0.log" Apr 22 16:13:38.154735 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:38.154691 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-569bb88bc4-h82nk_a309407a-446a-4c50-8e74-cbcba0c5c773/authorino/0.log" Apr 22 16:13:38.265164 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:38.265063 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-qtr2m_22a670b9-8247-4770-8552-d1c70e953050/manager/0.log" Apr 22 16:13:38.369147 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:38.369112 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-zvlgm_7d93114d-1c93-4d36-86b9-65f0997ff998/manager/0.log" Apr 22 16:13:38.574914 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:38.574886 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-flrd4_866a656c-4a80-40bb-9ca9-9162f1afea85/registry-server/0.log" Apr 22 16:13:38.691489 
ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:38.691462 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-tbrwn_cd66014d-0f3d-4db1-8b88-684f4d91bebf/manager/0.log" Apr 22 16:13:39.213211 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:39.213147 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4_e23cc38c-1adb-4c8a-ae0a-759d22d95fcf/istio-proxy/0.log" Apr 22 16:13:39.575149 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:39.575118 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxznf_fd3073fe-435c-4974-821b-9229018bf5f4/ovn-acl-logging/0.log" Apr 22 16:13:39.575343 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:39.575256 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxznf_fd3073fe-435c-4974-821b-9229018bf5f4/ovn-acl-logging/0.log" Apr 22 16:13:39.630372 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:39.630322 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-sbvm6_23455089-f91a-4266-8b97-86fb67ed9d63/istio-proxy/0.log" Apr 22 16:13:40.190652 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:40.190616 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k_fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15/main/0.log" Apr 22 16:13:40.197297 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:40.197267 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-jp29k_fd7a9a3e-dc94-48f3-b9d2-a9b53d2f8a15/storage-initializer/0.log" Apr 22 16:13:40.303933 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:40.303894 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-s5v2v_fce68264-52f2-4ac1-8162-43d838d95a89/storage-initializer/0.log" Apr 22 16:13:40.311064 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:40.311037 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-s5v2v_fce68264-52f2-4ac1-8162-43d838d95a89/main/0.log" Apr 22 16:13:44.176042 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:44.176008 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b5p7f/must-gather-pkcpb"] Apr 22 16:13:44.176450 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:44.176353 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="613dbe12-2e1c-43b9-86bf-f3064082cd7c" containerName="authorino" Apr 22 16:13:44.176450 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:44.176364 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="613dbe12-2e1c-43b9-86bf-f3064082cd7c" containerName="authorino" Apr 22 16:13:44.176450 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:44.176420 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="613dbe12-2e1c-43b9-86bf-f3064082cd7c" containerName="authorino" Apr 22 16:13:44.179441 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:44.179425 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b5p7f/must-gather-pkcpb" Apr 22 16:13:44.181994 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:44.181965 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-b5p7f\"/\"kube-root-ca.crt\"" Apr 22 16:13:44.181994 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:44.181986 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-b5p7f\"/\"openshift-service-ca.crt\"" Apr 22 16:13:44.182858 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:44.182840 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-b5p7f\"/\"default-dockercfg-bfrf8\"" Apr 22 16:13:44.196928 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:44.196906 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b5p7f/must-gather-pkcpb"] Apr 22 16:13:44.237687 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:44.237656 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78vd7\" (UniqueName: \"kubernetes.io/projected/21fba433-1cdb-4dc4-8b3f-0888ed4da304-kube-api-access-78vd7\") pod \"must-gather-pkcpb\" (UID: \"21fba433-1cdb-4dc4-8b3f-0888ed4da304\") " pod="openshift-must-gather-b5p7f/must-gather-pkcpb" Apr 22 16:13:44.237858 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:44.237723 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/21fba433-1cdb-4dc4-8b3f-0888ed4da304-must-gather-output\") pod \"must-gather-pkcpb\" (UID: \"21fba433-1cdb-4dc4-8b3f-0888ed4da304\") " pod="openshift-must-gather-b5p7f/must-gather-pkcpb" Apr 22 16:13:44.339125 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:44.339095 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78vd7\" (UniqueName: 
\"kubernetes.io/projected/21fba433-1cdb-4dc4-8b3f-0888ed4da304-kube-api-access-78vd7\") pod \"must-gather-pkcpb\" (UID: \"21fba433-1cdb-4dc4-8b3f-0888ed4da304\") " pod="openshift-must-gather-b5p7f/must-gather-pkcpb" Apr 22 16:13:44.339336 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:44.339162 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/21fba433-1cdb-4dc4-8b3f-0888ed4da304-must-gather-output\") pod \"must-gather-pkcpb\" (UID: \"21fba433-1cdb-4dc4-8b3f-0888ed4da304\") " pod="openshift-must-gather-b5p7f/must-gather-pkcpb" Apr 22 16:13:44.339554 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:44.339536 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/21fba433-1cdb-4dc4-8b3f-0888ed4da304-must-gather-output\") pod \"must-gather-pkcpb\" (UID: \"21fba433-1cdb-4dc4-8b3f-0888ed4da304\") " pod="openshift-must-gather-b5p7f/must-gather-pkcpb" Apr 22 16:13:44.346965 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:44.346935 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78vd7\" (UniqueName: \"kubernetes.io/projected/21fba433-1cdb-4dc4-8b3f-0888ed4da304-kube-api-access-78vd7\") pod \"must-gather-pkcpb\" (UID: \"21fba433-1cdb-4dc4-8b3f-0888ed4da304\") " pod="openshift-must-gather-b5p7f/must-gather-pkcpb" Apr 22 16:13:44.488148 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:44.488067 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b5p7f/must-gather-pkcpb" Apr 22 16:13:44.616038 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:44.615884 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b5p7f/must-gather-pkcpb"] Apr 22 16:13:44.618754 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:13:44.618727 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21fba433_1cdb_4dc4_8b3f_0888ed4da304.slice/crio-0685b32c67ed75ed79db43e400b76ae6c4cb3fb1eedc655354807c6461623efc WatchSource:0}: Error finding container 0685b32c67ed75ed79db43e400b76ae6c4cb3fb1eedc655354807c6461623efc: Status 404 returned error can't find the container with id 0685b32c67ed75ed79db43e400b76ae6c4cb3fb1eedc655354807c6461623efc Apr 22 16:13:44.728446 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:44.728407 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b5p7f/must-gather-pkcpb" event={"ID":"21fba433-1cdb-4dc4-8b3f-0888ed4da304","Type":"ContainerStarted","Data":"0685b32c67ed75ed79db43e400b76ae6c4cb3fb1eedc655354807c6461623efc"} Apr 22 16:13:45.736676 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:45.736629 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b5p7f/must-gather-pkcpb" event={"ID":"21fba433-1cdb-4dc4-8b3f-0888ed4da304","Type":"ContainerStarted","Data":"97182bdbdf2f0fe96fd2732f77b44d0a6cac2730efed4f22826570a36d388570"} Apr 22 16:13:46.744431 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:46.744170 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b5p7f/must-gather-pkcpb" event={"ID":"21fba433-1cdb-4dc4-8b3f-0888ed4da304","Type":"ContainerStarted","Data":"1521995bb0a05099a08508a7b9b53610a62185577db0b0dc2f0a8bfe6cf426d6"} Apr 22 16:13:46.759026 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:46.758978 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-b5p7f/must-gather-pkcpb" podStartSLOduration=1.805534982 podStartE2EDuration="2.758962354s" podCreationTimestamp="2026-04-22 16:13:44 +0000 UTC" firstStartedPulling="2026-04-22 16:13:44.620609945 +0000 UTC m=+905.520163997" lastFinishedPulling="2026-04-22 16:13:45.574037317 +0000 UTC m=+906.473591369" observedRunningTime="2026-04-22 16:13:46.758151004 +0000 UTC m=+907.657705080" watchObservedRunningTime="2026-04-22 16:13:46.758962354 +0000 UTC m=+907.658516424" Apr 22 16:13:47.192005 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:47.191973 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-rs5cd_c2282f87-8976-4573-9b9c-d12f85477077/global-pull-secret-syncer/0.log" Apr 22 16:13:47.302614 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:47.302579 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-nbrr9_9ebd4e04-7111-4378-9b6d-f2d25a0e4642/konnectivity-agent/0.log" Apr 22 16:13:47.396940 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:47.396898 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-9.ec2.internal_4bba7b470f499a19673a3db16932ed93/haproxy/0.log" Apr 22 16:13:51.567541 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:51.567511 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62_38c23430-348a-4872-9a3a-c52871fc1766/extract/0.log" Apr 22 16:13:51.595945 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:51.595910 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62_38c23430-348a-4872-9a3a-c52871fc1766/util/0.log" Apr 22 16:13:51.620737 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:51.620705 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7596xt62_38c23430-348a-4872-9a3a-c52871fc1766/pull/0.log" Apr 22 16:13:51.651477 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:51.651437 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j_957b9816-c3b2-4085-9c84-b215810dd6ca/extract/0.log" Apr 22 16:13:51.677345 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:51.677319 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j_957b9816-c3b2-4085-9c84-b215810dd6ca/util/0.log" Apr 22 16:13:51.699739 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:51.699703 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e08rk6j_957b9816-c3b2-4085-9c84-b215810dd6ca/pull/0.log" Apr 22 16:13:51.727246 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:51.727182 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9_53b8f533-ce84-45f1-913e-fa4b16c3cea9/extract/0.log" Apr 22 16:13:51.752239 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:51.752179 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9_53b8f533-ce84-45f1-913e-fa4b16c3cea9/util/0.log" Apr 22 16:13:51.776820 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:51.776790 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73kj2g9_53b8f533-ce84-45f1-913e-fa4b16c3cea9/pull/0.log" Apr 22 16:13:51.808434 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:51.808403 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l_f54c5736-e675-4465-a724-f1b413683899/extract/0.log" Apr 22 16:13:51.831442 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:51.831404 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l_f54c5736-e675-4465-a724-f1b413683899/util/0.log" Apr 22 16:13:51.862181 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:51.862145 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1c459l_f54c5736-e675-4465-a724-f1b413683899/pull/0.log" Apr 22 16:13:51.900450 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:51.900407 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-569bb88bc4-h82nk_a309407a-446a-4c50-8e74-cbcba0c5c773/authorino/0.log" Apr 22 16:13:51.937039 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:51.937001 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-qtr2m_22a670b9-8247-4770-8552-d1c70e953050/manager/0.log" Apr 22 16:13:51.977496 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:51.977465 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-zvlgm_7d93114d-1c93-4d36-86b9-65f0997ff998/manager/0.log" Apr 22 16:13:52.038951 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:52.038919 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-flrd4_866a656c-4a80-40bb-9ca9-9162f1afea85/registry-server/0.log" Apr 22 16:13:52.095358 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:52.095278 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-tbrwn_cd66014d-0f3d-4db1-8b88-684f4d91bebf/manager/0.log" Apr 22 16:13:54.083651 
ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:54.083619 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tqbzl_26908f87-5ff2-4d32-a9a8-20e451a88f3b/node-exporter/0.log"
Apr 22 16:13:54.107457 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:54.107428 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tqbzl_26908f87-5ff2-4d32-a9a8-20e451a88f3b/kube-rbac-proxy/0.log"
Apr 22 16:13:54.128372 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:54.128303 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tqbzl_26908f87-5ff2-4d32-a9a8-20e451a88f3b/init-textfile/0.log"
Apr 22 16:13:56.016887 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:56.016848 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n"]
Apr 22 16:13:56.022476 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:56.022420 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n"
Apr 22 16:13:56.023999 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:56.023974 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n"]
Apr 22 16:13:56.159563 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:56.159518 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f73b8a4c-0b79-48ea-88b2-190ee87aeff7-podres\") pod \"perf-node-gather-daemonset-9b96n\" (UID: \"f73b8a4c-0b79-48ea-88b2-190ee87aeff7\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n"
Apr 22 16:13:56.159761 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:56.159588 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpvbz\" (UniqueName: \"kubernetes.io/projected/f73b8a4c-0b79-48ea-88b2-190ee87aeff7-kube-api-access-xpvbz\") pod \"perf-node-gather-daemonset-9b96n\" (UID: \"f73b8a4c-0b79-48ea-88b2-190ee87aeff7\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n"
Apr 22 16:13:56.159761 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:56.159673 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f73b8a4c-0b79-48ea-88b2-190ee87aeff7-lib-modules\") pod \"perf-node-gather-daemonset-9b96n\" (UID: \"f73b8a4c-0b79-48ea-88b2-190ee87aeff7\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n"
Apr 22 16:13:56.159761 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:56.159719 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f73b8a4c-0b79-48ea-88b2-190ee87aeff7-proc\") pod \"perf-node-gather-daemonset-9b96n\" (UID: \"f73b8a4c-0b79-48ea-88b2-190ee87aeff7\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n"
Apr 22 16:13:56.159761 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:56.159742 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f73b8a4c-0b79-48ea-88b2-190ee87aeff7-sys\") pod \"perf-node-gather-daemonset-9b96n\" (UID: \"f73b8a4c-0b79-48ea-88b2-190ee87aeff7\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n"
Apr 22 16:13:56.260493 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:56.260449 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f73b8a4c-0b79-48ea-88b2-190ee87aeff7-lib-modules\") pod \"perf-node-gather-daemonset-9b96n\" (UID: \"f73b8a4c-0b79-48ea-88b2-190ee87aeff7\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n"
Apr 22 16:13:56.260677 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:56.260523 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f73b8a4c-0b79-48ea-88b2-190ee87aeff7-proc\") pod \"perf-node-gather-daemonset-9b96n\" (UID: \"f73b8a4c-0b79-48ea-88b2-190ee87aeff7\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n"
Apr 22 16:13:56.260677 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:56.260551 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f73b8a4c-0b79-48ea-88b2-190ee87aeff7-sys\") pod \"perf-node-gather-daemonset-9b96n\" (UID: \"f73b8a4c-0b79-48ea-88b2-190ee87aeff7\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n"
Apr 22 16:13:56.260677 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:56.260613 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f73b8a4c-0b79-48ea-88b2-190ee87aeff7-podres\") pod \"perf-node-gather-daemonset-9b96n\" (UID: \"f73b8a4c-0b79-48ea-88b2-190ee87aeff7\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n"
Apr 22 16:13:56.260677 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:56.260650 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpvbz\" (UniqueName: \"kubernetes.io/projected/f73b8a4c-0b79-48ea-88b2-190ee87aeff7-kube-api-access-xpvbz\") pod \"perf-node-gather-daemonset-9b96n\" (UID: \"f73b8a4c-0b79-48ea-88b2-190ee87aeff7\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n"
Apr 22 16:13:56.261227 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:56.261178 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f73b8a4c-0b79-48ea-88b2-190ee87aeff7-lib-modules\") pod \"perf-node-gather-daemonset-9b96n\" (UID: \"f73b8a4c-0b79-48ea-88b2-190ee87aeff7\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n"
Apr 22 16:13:56.261332 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:56.261224 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f73b8a4c-0b79-48ea-88b2-190ee87aeff7-sys\") pod \"perf-node-gather-daemonset-9b96n\" (UID: \"f73b8a4c-0b79-48ea-88b2-190ee87aeff7\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n"
Apr 22 16:13:56.261332 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:56.261236 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f73b8a4c-0b79-48ea-88b2-190ee87aeff7-proc\") pod \"perf-node-gather-daemonset-9b96n\" (UID: \"f73b8a4c-0b79-48ea-88b2-190ee87aeff7\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n"
Apr 22 16:13:56.261332 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:56.261298 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f73b8a4c-0b79-48ea-88b2-190ee87aeff7-podres\") pod \"perf-node-gather-daemonset-9b96n\" (UID: \"f73b8a4c-0b79-48ea-88b2-190ee87aeff7\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n"
Apr 22 16:13:56.268831 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:56.268759 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpvbz\" (UniqueName: \"kubernetes.io/projected/f73b8a4c-0b79-48ea-88b2-190ee87aeff7-kube-api-access-xpvbz\") pod \"perf-node-gather-daemonset-9b96n\" (UID: \"f73b8a4c-0b79-48ea-88b2-190ee87aeff7\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n"
Apr 22 16:13:56.337146 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:56.337116 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n"
Apr 22 16:13:56.500070 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:56.500035 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n"]
Apr 22 16:13:56.503231 ip-10-0-135-9 kubenswrapper[2572]: W0422 16:13:56.503171 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf73b8a4c_0b79_48ea_88b2_190ee87aeff7.slice/crio-fcdfbf26e2e29bdabb3179796496a8c1f8318b0734e27bb2d47a841eb6c59ba8 WatchSource:0}: Error finding container fcdfbf26e2e29bdabb3179796496a8c1f8318b0734e27bb2d47a841eb6c59ba8: Status 404 returned error can't find the container with id fcdfbf26e2e29bdabb3179796496a8c1f8318b0734e27bb2d47a841eb6c59ba8
Apr 22 16:13:56.797480 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:56.797404 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n" event={"ID":"f73b8a4c-0b79-48ea-88b2-190ee87aeff7","Type":"ContainerStarted","Data":"2d80fa7045c2cf41b863f6b9ff00d76270f72c90c1e914860d1940c02353c754"}
Apr 22 16:13:56.797480 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:56.797439 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n" event={"ID":"f73b8a4c-0b79-48ea-88b2-190ee87aeff7","Type":"ContainerStarted","Data":"fcdfbf26e2e29bdabb3179796496a8c1f8318b0734e27bb2d47a841eb6c59ba8"}
Apr 22 16:13:56.797707 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:56.797531 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n"
Apr 22 16:13:56.811462 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:56.811419 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n" podStartSLOduration=0.811404362 podStartE2EDuration="811.404362ms" podCreationTimestamp="2026-04-22 16:13:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:13:56.810256662 +0000 UTC m=+917.709810732" watchObservedRunningTime="2026-04-22 16:13:56.811404362 +0000 UTC m=+917.710958433"
Apr 22 16:13:57.334674 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:57.334639 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-xsb7b_c859ac10-b350-418d-b543-fde1e18ef074/volume-data-source-validator/0.log"
Apr 22 16:13:58.177743 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:58.177712 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nfknt_f2e728dc-359a-4e6e-831e-f1b83f015c97/dns/0.log"
Apr 22 16:13:58.198597 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:58.198560 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nfknt_f2e728dc-359a-4e6e-831e-f1b83f015c97/kube-rbac-proxy/0.log"
Apr 22 16:13:58.268154 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:58.268129 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4sqfs_a60066e5-252b-4865-879a-0d0d3a6618d4/dns-node-resolver/0.log"
Apr 22 16:13:58.858862 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:58.858832 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9d6jl_5756e223-5da3-420b-a640-5e3cdce35004/node-ca/0.log"
Apr 22 16:13:59.688423 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:59.688392 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfwq8r4_e23cc38c-1adb-4c8a-ae0a-759d22d95fcf/istio-proxy/0.log"
Apr 22 16:13:59.819082 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:13:59.819053 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-sbvm6_23455089-f91a-4266-8b97-86fb67ed9d63/istio-proxy/0.log"
Apr 22 16:14:00.446706 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:00.446651 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-pcv67_11c29039-8c35-465e-8df3-408a688c08ed/serve-healthcheck-canary/0.log"
Apr 22 16:14:00.927169 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:00.927136 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-jbznj_14ff43e0-e359-4557-9f79-d5452a8479a0/insights-operator/1.log"
Apr 22 16:14:00.927635 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:00.927594 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-jbznj_14ff43e0-e359-4557-9f79-d5452a8479a0/insights-operator/0.log"
Apr 22 16:14:01.079862 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:01.079826 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rkfvs_50e48078-0749-491a-b7f0-fec2248f200a/kube-rbac-proxy/0.log"
Apr 22 16:14:01.100442 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:01.100415 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rkfvs_50e48078-0749-491a-b7f0-fec2248f200a/exporter/0.log"
Apr 22 16:14:01.122090 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:01.122064 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rkfvs_50e48078-0749-491a-b7f0-fec2248f200a/extractor/0.log"
Apr 22 16:14:02.813000 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:02.812971 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-9b96n"
Apr 22 16:14:03.026629 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:03.026576 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-54dfb4598d-7pws5_9d0807db-368f-4d23-a54f-aba01a637eef/manager/0.log"
Apr 22 16:14:04.179603 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:04.179562 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-7d868c4d86-94h4g_efc86556-664b-4f91-906f-18b172fc9c9c/manager/0.log"
Apr 22 16:14:08.670418 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:08.670389 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-shl8s_cd8b8745-3849-4e7e-a5f5-60ba35f4329e/migrator/0.log"
Apr 22 16:14:08.690971 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:08.690942 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-shl8s_cd8b8745-3849-4e7e-a5f5-60ba35f4329e/graceful-termination/0.log"
Apr 22 16:14:09.043238 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:09.043135 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-j58f5_8d7ddd84-35ed-400b-ad69-647f50964d8c/kube-storage-version-migrator-operator/1.log"
Apr 22 16:14:09.045175 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:09.045144 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-j58f5_8d7ddd84-35ed-400b-ad69-647f50964d8c/kube-storage-version-migrator-operator/0.log"
Apr 22 16:14:10.055753 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:10.055723 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4tvm4_a1b11795-9e34-41fd-9198-cc57fa3cfbf7/kube-multus-additional-cni-plugins/0.log"
Apr 22 16:14:10.077719 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:10.077688 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4tvm4_a1b11795-9e34-41fd-9198-cc57fa3cfbf7/egress-router-binary-copy/0.log"
Apr 22 16:14:10.098480 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:10.098455 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4tvm4_a1b11795-9e34-41fd-9198-cc57fa3cfbf7/cni-plugins/0.log"
Apr 22 16:14:10.119950 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:10.119918 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4tvm4_a1b11795-9e34-41fd-9198-cc57fa3cfbf7/bond-cni-plugin/0.log"
Apr 22 16:14:10.142020 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:10.141998 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4tvm4_a1b11795-9e34-41fd-9198-cc57fa3cfbf7/routeoverride-cni/0.log"
Apr 22 16:14:10.163433 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:10.163407 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4tvm4_a1b11795-9e34-41fd-9198-cc57fa3cfbf7/whereabouts-cni-bincopy/0.log"
Apr 22 16:14:10.184031 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:10.184002 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4tvm4_a1b11795-9e34-41fd-9198-cc57fa3cfbf7/whereabouts-cni/0.log"
Apr 22 16:14:10.544424 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:10.544400 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmd9q_757bc440-2a2c-42f8-8e5d-03be90e55484/kube-multus/0.log"
Apr 22 16:14:10.681262 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:10.681229 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-76x4b_c090a1ee-5091-44d6-9e1b-65bf4dc8b1be/network-metrics-daemon/0.log"
Apr 22 16:14:10.699810 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:10.699779 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-76x4b_c090a1ee-5091-44d6-9e1b-65bf4dc8b1be/kube-rbac-proxy/0.log"
Apr 22 16:14:12.102686 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:12.102648 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxznf_fd3073fe-435c-4974-821b-9229018bf5f4/ovn-controller/0.log"
Apr 22 16:14:12.119959 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:12.119926 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxznf_fd3073fe-435c-4974-821b-9229018bf5f4/ovn-acl-logging/0.log"
Apr 22 16:14:12.129318 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:12.129292 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxznf_fd3073fe-435c-4974-821b-9229018bf5f4/ovn-acl-logging/1.log"
Apr 22 16:14:12.149736 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:12.149704 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxznf_fd3073fe-435c-4974-821b-9229018bf5f4/kube-rbac-proxy-node/0.log"
Apr 22 16:14:12.173492 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:12.173462 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxznf_fd3073fe-435c-4974-821b-9229018bf5f4/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 16:14:12.192913 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:12.192887 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxznf_fd3073fe-435c-4974-821b-9229018bf5f4/northd/0.log"
Apr 22 16:14:12.214320 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:12.214293 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxznf_fd3073fe-435c-4974-821b-9229018bf5f4/nbdb/0.log"
Apr 22 16:14:12.235363 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:12.235331 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxznf_fd3073fe-435c-4974-821b-9229018bf5f4/sbdb/0.log"
Apr 22 16:14:12.414742 ip-10-0-135-9 kubenswrapper[2572]: I0422 16:14:12.414675 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xxznf_fd3073fe-435c-4974-821b-9229018bf5f4/ovnkube-controller/0.log"