Apr 24 22:27:07.943733 ip-10-0-135-222 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 22:27:07.943745 ip-10-0-135-222 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 22:27:07.943754 ip-10-0-135-222 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 22:27:07.944253 ip-10-0-135-222 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 22:27:18.184849 ip-10-0-135-222 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 22:27:18.184873 ip-10-0-135-222 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot a0479fa92ace4f039ca5ed24bb5fe844 --
Apr 24 22:29:39.921386 ip-10-0-135-222 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 22:29:40.380582 ip-10-0-135-222 kubenswrapper[2582]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:29:40.380582 ip-10-0-135-222 kubenswrapper[2582]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 22:29:40.380582 ip-10-0-135-222 kubenswrapper[2582]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:29:40.380582 ip-10-0-135-222 kubenswrapper[2582]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 22:29:40.381298 ip-10-0-135-222 kubenswrapper[2582]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:29:40.382681 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.382446 2582 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 22:29:40.387349 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387333 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:40.387349 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387349 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:40.387421 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387353 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:40.387421 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387356 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:40.387421 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387359 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:40.387421 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387363 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:40.387421 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387369 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:40.387421 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387374 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:40.387421 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387378 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:40.387421 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387381 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:40.387421 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387384 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:40.387421 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387387 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:40.387421 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387390 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:40.387421 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387393 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:40.387421 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387395 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:40.387421 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387398 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:40.387421 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387401 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:40.387421 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387403 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:40.387421 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387405 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:40.387421 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387408 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:40.387421 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387410 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:40.387912 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387413 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:40.387912 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387416 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:40.387912 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387419 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:40.387912 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387422 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:40.387912 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387424 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:40.387912 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387427 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:40.387912 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387430 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:40.387912 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387432 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:40.387912 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387443 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:40.387912 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387446 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:40.387912 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387449 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:40.387912 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387451 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:40.387912 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387454 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:40.387912 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387456 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:40.387912 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387459 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:40.387912 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387462 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:40.387912 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387464 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:40.387912 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387466 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:40.387912 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387471 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:40.387912 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387473 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:40.388400 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387475 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:40.388400 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387478 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:40.388400 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387481 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:40.388400 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387483 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:40.388400 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387486 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:40.388400 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387488 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:40.388400 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387491 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:40.388400 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387494 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:40.388400 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387496 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:40.388400 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387499 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:40.388400 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387501 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:40.388400 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387503 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:40.388400 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387506 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:40.388400 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387509 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:40.388400 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387512 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:40.388400 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387514 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:40.388400 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387517 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:40.388400 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387519 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:40.388400 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387522 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:40.388400 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387525 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:40.388979 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387527 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:40.388979 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387530 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:40.388979 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387532 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:40.388979 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387535 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:40.388979 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387537 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:40.388979 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387540 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:40.388979 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387542 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:40.388979 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387545 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:40.388979 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387547 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:40.388979 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387550 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:40.388979 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387552 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:40.388979 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387555 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:40.388979 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387559 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:40.388979 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387561 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:40.388979 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387564 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:40.388979 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387566 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:40.388979 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387569 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:40.388979 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387571 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:40.388979 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387574 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:40.388979 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387577 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:40.389489 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387579 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:40.389489 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387582 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:40.389489 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387584 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:40.389489 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387587 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:40.389489 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.387590 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:40.389489 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388000 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:40.389489 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388005 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:40.389489 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388008 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:40.389489 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388011 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:40.389489 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388015 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:40.389489 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388018 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:40.389489 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388021 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:40.389489 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388024 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:40.389489 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388027 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:40.389489 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388030 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:40.389489 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388033 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:40.389489 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388035 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:40.389489 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388038 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:40.389489 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388041 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:40.389945 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388043 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:40.389945 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388045 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:40.389945 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388048 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:40.389945 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388050 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:40.389945 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388053 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:40.389945 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388070 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:40.389945 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388073 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:40.389945 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388077 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:40.389945 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388079 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:40.389945 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388082 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:40.389945 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388085 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:40.389945 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388087 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:40.389945 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388090 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:40.389945 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388092 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:40.389945 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388095 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:40.389945 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388098 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:40.389945 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388100 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:40.389945 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388103 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:40.389945 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388106 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:40.390430 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388109 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:40.390430 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388112 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:40.390430 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388115 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:40.390430 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388117 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:40.390430 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388120 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:40.390430 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388123 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:40.390430 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388125 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:40.390430 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388128 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:40.390430 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388130 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:40.390430 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388133 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:40.390430 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388135 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:40.390430 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388137 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:40.390430 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388140 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:40.390430 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388143 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:40.390430 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388145 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:40.390430 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388148 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:40.390430 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388150 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:40.390430 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388153 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:40.390430 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388155 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:40.390430 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388157 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:40.390919 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388160 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:40.390919 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388163 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:40.390919 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388165 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:40.390919 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388168 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:40.390919 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388171 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:40.390919 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388174 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:40.390919 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388176 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:40.390919 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388179 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:40.390919 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388181 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:40.390919 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388184 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:40.390919 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388186 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:40.390919 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388189 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:40.390919 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388192 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:40.390919 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388194 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:40.390919 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388197 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:40.390919 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388199 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:40.390919 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388202 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:40.390919 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388205 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:40.390919 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388207 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:40.390919 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388211 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:40.391432 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388213 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:40.391432 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388216 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:40.391432 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388220 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:40.391432 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388223 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:40.391432 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388226 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:40.391432 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388229 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:40.391432 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388231 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:40.391432 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388233 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:40.391432 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388236 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:40.391432 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388238 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:40.391432 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388241 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:40.391432 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388243 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:40.391432 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388245 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:40.391432 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388327 2582 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 22:29:40.391432 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388335 2582 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 22:29:40.391432 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388341 2582 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 22:29:40.391432 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388346 2582 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 22:29:40.391432 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388351 2582 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 22:29:40.391432 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388354 2582 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 22:29:40.391432 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388359 2582 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 22:29:40.391942 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388363 2582 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 22:29:40.391942 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388367 2582 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 22:29:40.391942 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388370 2582 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 22:29:40.391942 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388374 2582 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 22:29:40.391942 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388378 2582 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 22:29:40.391942 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388381 2582 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 22:29:40.391942 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388384 2582 flags.go:64] FLAG: --cgroup-root=""
Apr 24 22:29:40.391942 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388387 2582 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 22:29:40.391942 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388390 2582 flags.go:64] FLAG: --client-ca-file=""
Apr 24 22:29:40.391942 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388393 2582 flags.go:64] FLAG: --cloud-config=""
Apr 24 22:29:40.391942 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388395 2582 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 22:29:40.391942 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388398 2582 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 22:29:40.391942 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388411 2582 flags.go:64] FLAG: --cluster-domain=""
Apr 24 22:29:40.391942 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388414 2582 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 22:29:40.391942 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388418 2582 flags.go:64] FLAG: --config-dir=""
Apr 24 22:29:40.391942 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388421 2582 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 22:29:40.391942 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388424 2582 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 22:29:40.391942 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388433 2582 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 22:29:40.391942 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388435 2582 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 22:29:40.391942 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388439 2582 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 22:29:40.391942 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388442 2582 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 22:29:40.391942 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388445 2582 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 22:29:40.391942 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388448 2582 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 22:29:40.391942 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388451 2582 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 22:29:40.392529 ip-10-0-135-222
kubenswrapper[2582]: I0424 22:29:40.388454 2582 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 24 22:29:40.392529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388457 2582 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 24 22:29:40.392529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388461 2582 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 24 22:29:40.392529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388465 2582 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 24 22:29:40.392529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388468 2582 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 24 22:29:40.392529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388470 2582 flags.go:64] FLAG: --enable-load-reader="false" Apr 24 22:29:40.392529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388474 2582 flags.go:64] FLAG: --enable-server="true" Apr 24 22:29:40.392529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388477 2582 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 24 22:29:40.392529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388481 2582 flags.go:64] FLAG: --event-burst="100" Apr 24 22:29:40.392529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388484 2582 flags.go:64] FLAG: --event-qps="50" Apr 24 22:29:40.392529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388486 2582 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 24 22:29:40.392529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388490 2582 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 24 22:29:40.392529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388493 2582 flags.go:64] FLAG: --eviction-hard="" Apr 24 22:29:40.392529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388497 2582 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 24 22:29:40.392529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388500 2582 flags.go:64] FLAG: 
--eviction-minimum-reclaim="" Apr 24 22:29:40.392529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388503 2582 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 24 22:29:40.392529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388507 2582 flags.go:64] FLAG: --eviction-soft="" Apr 24 22:29:40.392529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388510 2582 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 24 22:29:40.392529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388512 2582 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 22:29:40.392529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388515 2582 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 22:29:40.392529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388518 2582 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 22:29:40.392529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388521 2582 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 22:29:40.392529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388524 2582 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 22:29:40.392529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388527 2582 flags.go:64] FLAG: --feature-gates="" Apr 24 22:29:40.392529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388531 2582 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 22:29:40.393182 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388534 2582 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 22:29:40.393182 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388536 2582 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 22:29:40.393182 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388540 2582 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 22:29:40.393182 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388543 2582 flags.go:64] FLAG: --healthz-port="10248" Apr 24 22:29:40.393182 ip-10-0-135-222 
kubenswrapper[2582]: I0424 22:29:40.388545 2582 flags.go:64] FLAG: --help="false" Apr 24 22:29:40.393182 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388548 2582 flags.go:64] FLAG: --hostname-override="ip-10-0-135-222.ec2.internal" Apr 24 22:29:40.393182 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388551 2582 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 22:29:40.393182 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388554 2582 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 22:29:40.393182 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388557 2582 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 22:29:40.393182 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388560 2582 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 22:29:40.393182 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388563 2582 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 22:29:40.393182 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388567 2582 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 22:29:40.393182 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388570 2582 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 22:29:40.393182 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388572 2582 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 22:29:40.393182 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388575 2582 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 22:29:40.393182 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388578 2582 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 22:29:40.393182 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388581 2582 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 22:29:40.393182 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388584 2582 flags.go:64] FLAG: 
--kube-reserved="" Apr 24 22:29:40.393182 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388588 2582 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 22:29:40.393182 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388591 2582 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 22:29:40.393182 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388594 2582 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 22:29:40.393182 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388597 2582 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 22:29:40.393182 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388600 2582 flags.go:64] FLAG: --lock-file="" Apr 24 22:29:40.393182 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388603 2582 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 22:29:40.393762 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388606 2582 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 22:29:40.393762 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388609 2582 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 22:29:40.393762 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388615 2582 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 22:29:40.393762 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388618 2582 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 22:29:40.393762 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388621 2582 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 22:29:40.393762 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388624 2582 flags.go:64] FLAG: --logging-format="text" Apr 24 22:29:40.393762 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388627 2582 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 22:29:40.393762 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388630 2582 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 22:29:40.393762 ip-10-0-135-222 
kubenswrapper[2582]: I0424 22:29:40.388633 2582 flags.go:64] FLAG: --manifest-url="" Apr 24 22:29:40.393762 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388636 2582 flags.go:64] FLAG: --manifest-url-header="" Apr 24 22:29:40.393762 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388640 2582 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 22:29:40.393762 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388644 2582 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 22:29:40.393762 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388648 2582 flags.go:64] FLAG: --max-pods="110" Apr 24 22:29:40.393762 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388651 2582 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 22:29:40.393762 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388654 2582 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 22:29:40.393762 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388657 2582 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 22:29:40.393762 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388660 2582 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 22:29:40.393762 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388663 2582 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 22:29:40.393762 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388666 2582 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 22:29:40.393762 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388668 2582 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 22:29:40.393762 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388677 2582 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 22:29:40.393762 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388680 2582 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 22:29:40.393762 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388683 2582 
flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 22:29:40.393762 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388686 2582 flags.go:64] FLAG: --pod-cidr="" Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388689 2582 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388695 2582 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388698 2582 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388701 2582 flags.go:64] FLAG: --pods-per-core="0" Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388704 2582 flags.go:64] FLAG: --port="10250" Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388707 2582 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388710 2582 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c1bdf1bd31e12941" Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388713 2582 flags.go:64] FLAG: --qos-reserved="" Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388716 2582 flags.go:64] FLAG: --read-only-port="10255" Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388719 2582 flags.go:64] FLAG: --register-node="true" Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388722 2582 flags.go:64] FLAG: --register-schedulable="true" Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388725 2582 flags.go:64] FLAG: --register-with-taints="" Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388728 2582 flags.go:64] FLAG: --registry-burst="10" 
Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388731 2582 flags.go:64] FLAG: --registry-qps="5" Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388734 2582 flags.go:64] FLAG: --reserved-cpus="" Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388737 2582 flags.go:64] FLAG: --reserved-memory="" Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388740 2582 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388743 2582 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388746 2582 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388749 2582 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388752 2582 flags.go:64] FLAG: --runonce="false" Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388755 2582 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388758 2582 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388761 2582 flags.go:64] FLAG: --seccomp-default="false" Apr 24 22:29:40.394356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388764 2582 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 22:29:40.395020 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388766 2582 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 22:29:40.395020 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388769 2582 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 22:29:40.395020 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388772 2582 flags.go:64] 
FLAG: --storage-driver-host="localhost:8086" Apr 24 22:29:40.395020 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388775 2582 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 22:29:40.395020 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388778 2582 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 22:29:40.395020 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388782 2582 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 22:29:40.395020 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388784 2582 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 22:29:40.395020 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388787 2582 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 22:29:40.395020 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388790 2582 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 22:29:40.395020 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388793 2582 flags.go:64] FLAG: --system-cgroups="" Apr 24 22:29:40.395020 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388796 2582 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 22:29:40.395020 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388802 2582 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 22:29:40.395020 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388805 2582 flags.go:64] FLAG: --tls-cert-file="" Apr 24 22:29:40.395020 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388808 2582 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 22:29:40.395020 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388812 2582 flags.go:64] FLAG: --tls-min-version="" Apr 24 22:29:40.395020 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388815 2582 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 22:29:40.395020 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388818 2582 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 22:29:40.395020 ip-10-0-135-222 
kubenswrapper[2582]: I0424 22:29:40.388821 2582 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 22:29:40.395020 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388823 2582 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 22:29:40.395020 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388826 2582 flags.go:64] FLAG: --v="2" Apr 24 22:29:40.395020 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388831 2582 flags.go:64] FLAG: --version="false" Apr 24 22:29:40.395020 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388835 2582 flags.go:64] FLAG: --vmodule="" Apr 24 22:29:40.395020 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388839 2582 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 22:29:40.395020 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.388842 2582 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 22:29:40.395020 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388935 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 22:29:40.395640 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388939 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 22:29:40.395640 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388942 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 22:29:40.395640 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388945 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 22:29:40.395640 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388947 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 22:29:40.395640 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388950 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 22:29:40.395640 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388953 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores 
Apr 24 22:29:40.395640 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388956 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 22:29:40.395640 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388959 2582 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 22:29:40.395640 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388962 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 22:29:40.395640 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388965 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 22:29:40.395640 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388967 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 22:29:40.395640 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388969 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 22:29:40.395640 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388973 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 22:29:40.395640 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388975 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 22:29:40.395640 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388978 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 22:29:40.395640 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388980 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 22:29:40.395640 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388984 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 22:29:40.395640 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388987 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 22:29:40.395640 ip-10-0-135-222 kubenswrapper[2582]: W0424 
22:29:40.388989 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 22:29:40.395640 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388992 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 22:29:40.396144 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388994 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 22:29:40.396144 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388997 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 22:29:40.396144 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.388999 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 22:29:40.396144 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389002 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 22:29:40.396144 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389004 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 22:29:40.396144 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389008 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 22:29:40.396144 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389012 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 22:29:40.396144 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389015 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 22:29:40.396144 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389018 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 22:29:40.396144 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389021 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 22:29:40.396144 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389024 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 22:29:40.396144 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389026 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 22:29:40.396144 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389029 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 22:29:40.396144 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389032 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 22:29:40.396144 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389035 2582 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 22:29:40.396144 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389037 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 22:29:40.396144 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389040 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 22:29:40.396144 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389042 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 22:29:40.396144 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389045 2582 feature_gate.go:328] 
unrecognized feature gate: OVNObservability Apr 24 22:29:40.396144 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389048 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 22:29:40.396633 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389068 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 22:29:40.396633 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389071 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 22:29:40.396633 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389073 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 22:29:40.396633 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389076 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 22:29:40.396633 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389079 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 22:29:40.396633 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389082 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 22:29:40.396633 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389085 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 22:29:40.396633 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389087 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 22:29:40.396633 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389090 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 22:29:40.396633 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389093 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 22:29:40.396633 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389096 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 22:29:40.396633 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389099 
2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 22:29:40.396633 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389102 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 22:29:40.396633 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389104 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 22:29:40.396633 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389107 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 22:29:40.396633 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389109 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 22:29:40.396633 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389112 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 22:29:40.396633 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389114 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 22:29:40.396633 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389118 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 22:29:40.396633 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389122 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:40.397156 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389124 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:40.397156 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389126 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:40.397156 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389129 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:40.397156 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389132 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:40.397156 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389134 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:40.397156 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389137 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:40.397156 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389141 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:40.397156 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389143 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:40.397156 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389146 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:40.397156 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389148 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:40.397156 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389151 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:40.397156 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389154 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:40.397156 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389158 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:40.397156 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389160 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:40.397156 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389163 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:40.397156 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389165 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:40.397156 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389168 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:40.397156 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389170 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:40.397156 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389173 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:40.397611 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389176 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:40.397611 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389178 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:40.397611 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389181 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:40.397611 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389183 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:40.397611 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389186 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:40.397611 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.389188 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:40.397611 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.389788 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 22:29:40.397611 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.396602 2582 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 22:29:40.397611 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.396620 2582 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 22:29:40.397611 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396673 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:40.397611 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396679 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:40.397611 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396683 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:40.397611 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396686 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:40.397611 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396690 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:40.397611 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396694 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:40.398025 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396697 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:40.398025 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396699 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:40.398025 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396702 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:40.398025 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396704 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:40.398025 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396707 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:40.398025 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396709 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:40.398025 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396712 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:40.398025 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396714 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:40.398025 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396717 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:40.398025 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396719 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:40.398025 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396722 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:40.398025 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396725 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:40.398025 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396727 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:40.398025 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396730 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:40.398025 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396733 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:40.398025 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396735 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:40.398025 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396738 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:40.398025 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396741 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:40.398025 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396743 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:40.398497 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396747 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:40.398497 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396751 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:40.398497 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396754 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:40.398497 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396757 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:40.398497 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396761 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:40.398497 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396764 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:40.398497 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396767 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:40.398497 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396770 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:40.398497 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396773 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:40.398497 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396776 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:40.398497 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396778 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:40.398497 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396781 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:40.398497 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396783 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:40.398497 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396785 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:40.398497 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396788 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:40.398497 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396790 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:40.398497 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396793 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:40.398497 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396795 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:40.398497 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396798 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:40.398958 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396800 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:40.398958 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396802 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:40.398958 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396805 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:40.398958 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396807 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:40.398958 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396810 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:40.398958 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396812 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:40.398958 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396814 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:40.398958 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396817 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:40.398958 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396820 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:40.398958 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396822 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:40.398958 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396825 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:40.398958 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396827 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:40.398958 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396830 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:40.398958 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396833 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:40.398958 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396835 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:40.398958 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396837 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:40.398958 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396840 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:40.398958 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396843 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:40.398958 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396845 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:40.398958 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396850 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:40.399481 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396852 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:40.399481 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396854 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:40.399481 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396857 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:40.399481 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396859 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:40.399481 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396862 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:40.399481 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396864 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:40.399481 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396867 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:40.399481 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396869 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:40.399481 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396871 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:40.399481 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396874 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:40.399481 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396876 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:40.399481 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396879 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:40.399481 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396881 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:40.399481 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396883 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:40.399481 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396886 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:40.399481 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396888 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:40.399481 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396891 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:40.399481 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396893 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:40.399481 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396896 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:40.399481 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396899 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:40.399965 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396901 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:40.399965 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.396904 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:40.399965 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.396909 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 22:29:40.399965 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397015 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:40.399965 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397020 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:40.399965 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397023 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:40.399965 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397026 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:40.399965 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397028 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:40.399965 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397031 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:40.399965 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397033 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:40.399965 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397036 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:40.399965 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397038 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:40.399965 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397041 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:40.399965 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397044 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:40.399965 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397048 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:40.400373 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397051 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:40.400373 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397073 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:40.400373 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397078 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:40.400373 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397082 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:40.400373 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397086 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:40.400373 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397091 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:40.400373 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397095 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:40.400373 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397099 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:40.400373 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397103 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:40.400373 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397106 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:40.400373 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397108 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:40.400373 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397111 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:40.400373 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397114 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:40.400373 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397116 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:40.400373 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397118 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:40.400373 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397121 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:40.400373 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397123 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:40.400373 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397126 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:40.400373 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397128 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:40.400373 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397130 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:40.400855 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397133 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:40.400855 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397135 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:40.400855 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397139 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:40.400855 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397141 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:40.400855 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397144 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:40.400855 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397146 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:40.400855 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397149 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:40.400855 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397151 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:40.400855 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397153 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:40.400855 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397156 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:40.400855 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397159 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:40.400855 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397162 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:40.400855 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397164 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:40.400855 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397167 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:40.400855 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397169 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:40.400855 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397172 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:40.400855 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397175 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:40.400855 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397177 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:40.400855 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397179 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:40.400855 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397182 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:40.401362 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397185 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:40.401362 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397187 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:40.401362 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397189 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:40.401362 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397192 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:40.401362 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397194 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:40.401362 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397196 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:40.401362 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397199 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:40.401362 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397201 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:40.401362 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397203 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:40.401362 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397206 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:40.401362 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397209 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:40.401362 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397212 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:40.401362 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397215 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:40.401362 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397217 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:40.401362 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397220 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:40.401362 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397229 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:40.401362 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397232 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:40.401362 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397234 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:40.401362 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397236 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:40.401362 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397239 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:40.401884 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397241 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:40.401884 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397244 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:40.401884 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397247 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:40.401884 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397249 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:40.401884 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397251 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:40.401884 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397254 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:40.401884 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397256 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:40.401884 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397258 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:40.401884 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397261 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:40.401884 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397263 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:40.401884 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397266 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:40.401884 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397268 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:40.401884 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397271 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:40.401884 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:40.397273 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:40.401884 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.397278 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 22:29:40.402362 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.398051 2582 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 22:29:40.403743 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.403726 2582 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 22:29:40.404702 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.404690 2582 server.go:1019] "Starting client certificate rotation"
Apr 24 22:29:40.404816 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.404797 2582 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 22:29:40.404852 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.404843 2582 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 22:29:40.431413 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.431374 2582 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 22:29:40.438794 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.438766 2582 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 22:29:40.453513 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.453487 2582 log.go:25] "Validated CRI v1 runtime API"
Apr 24 22:29:40.459255 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.459236 2582 log.go:25] "Validated CRI v1 image API"
Apr 24 22:29:40.460603 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.460589 2582 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 22:29:40.464447 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.464426 2582 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 22:29:40.467627 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.467602 2582 fs.go:135] Filesystem UUIDs: map[621996e2-10fe-4951-878d-5684c3132f7b:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 ca0d6851-6f3c-4764-9db1-378b5a16d4c2:/dev/nvme0n1p4]
Apr 24 22:29:40.467714 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.467626 2582 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 22:29:40.474087 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.473935 2582 manager.go:217] Machine: {Timestamp:2026-04-24 22:29:40.471786119 +0000 UTC m=+0.415577460 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098422 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2120092ac909a43bf7c2fa67ebb141 SystemUUID:ec212009-2ac9-09a4-3bf7-c2fa67ebb141 BootID:a0479fa9-2ace-4f03-9ca5-ed24bb5fe844 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:e3:d0:35:78:e5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:e3:d0:35:78:e5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5a:2e:b9:46:41:14 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 22:29:40.474087 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.474072 2582 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 22:29:40.474275 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.474202 2582 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 22:29:40.475486 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.475451 2582 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 22:29:40.475681 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.475490 2582 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-222.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,
"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 22:29:40.475775 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.475696 2582 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 22:29:40.475775 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.475709 2582 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 22:29:40.475775 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.475728 2582 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 22:29:40.476464 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.476451 2582 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 22:29:40.477733 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.477720 2582 state_mem.go:36] "Initialized new in-memory state store" Apr 24 22:29:40.477864 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.477854 2582 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 22:29:40.480594 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.480581 2582 kubelet.go:491] "Attempting to sync node with API server" Apr 24 22:29:40.480653 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.480608 2582 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 22:29:40.480653 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.480629 2582 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 22:29:40.480737 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.480716 2582 kubelet.go:397] "Adding apiserver pod source" Apr 24 
22:29:40.480737 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.480729 2582 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 22:29:40.481897 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.481884 2582 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 22:29:40.481974 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.481907 2582 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 22:29:40.485106 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.485080 2582 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 22:29:40.486892 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.486878 2582 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 22:29:40.488336 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.488317 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 22:29:40.488393 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.488350 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 22:29:40.488393 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.488357 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 22:29:40.488393 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.488363 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 22:29:40.488393 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.488368 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 22:29:40.488393 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.488373 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 22:29:40.488393 
ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.488379 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 22:29:40.488393 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.488384 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 22:29:40.488393 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.488391 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 22:29:40.488393 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.488397 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 22:29:40.488622 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.488409 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 22:29:40.488622 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.488418 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 22:29:40.489337 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.489328 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 22:29:40.489337 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.489337 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 22:29:40.491920 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.491900 2582 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-926lc" Apr 24 22:29:40.493166 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.493146 2582 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 22:29:40.493251 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.493169 2582 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-222.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 
22:29:40.493251 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.493193 2582 server.go:1295] "Started kubelet" Apr 24 22:29:40.493251 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:40.493198 2582 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 22:29:40.494274 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:40.494242 2582 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-222.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 22:29:40.495016 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.494939 2582 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 22:29:40.495082 ip-10-0-135-222 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 22:29:40.495754 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.495721 2582 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 24 22:29:40.495853 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.495833 2582 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 24 22:29:40.498125 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.498107 2582 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 22:29:40.499146 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.499124 2582 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-926lc"
Apr 24 22:29:40.499411 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.499396 2582 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 22:29:40.503639 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:40.502745 2582 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-222.ec2.internal.18a96b8ccf605dc9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-222.ec2.internal,UID:ip-10-0-135-222.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-222.ec2.internal,},FirstTimestamp:2026-04-24 22:29:40.493163977 +0000 UTC m=+0.436955300,LastTimestamp:2026-04-24 22:29:40.493163977 +0000 UTC m=+0.436955300,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-222.ec2.internal,}"
Apr 24 22:29:40.505183 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.503936 2582 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 24 22:29:40.505183 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.505051 2582 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 24 22:29:40.505183 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.505106 2582 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 24 22:29:40.505376 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.505193 2582 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 24 22:29:40.505906 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.505636 2582 reconstruct.go:97] "Volume reconstruction finished"
Apr 24 22:29:40.505906 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.505829 2582 reconciler.go:26] "Reconciler: start to sync state"
Apr 24 22:29:40.505906 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.505847 2582 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 24 22:29:40.506715 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.506230 2582 factory.go:55] Registering systemd factory
Apr 24 22:29:40.506715 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.506263 2582 factory.go:223] Registration of the systemd container factory successfully
Apr 24 22:29:40.506715 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.506495 2582 factory.go:153] Registering CRI-O factory
Apr 24 22:29:40.506715 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.506511 2582 factory.go:223] Registration of the crio container factory successfully
Apr 24 22:29:40.506715 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.506601 2582 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 24 22:29:40.506715 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.506629 2582 factory.go:103] Registering Raw factory
Apr 24 22:29:40.506715 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.506646 2582 manager.go:1196] Started watching for new ooms in manager
Apr 24 22:29:40.506715 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:40.506667 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-222.ec2.internal\" not found"
Apr 24 22:29:40.507171 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.507094 2582 manager.go:319] Starting recovery of all containers
Apr 24 22:29:40.507525 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:40.507500 2582 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 24 22:29:40.516865 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.516846 2582 manager.go:324] Recovery completed
Apr 24 22:29:40.517143 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.517125 2582 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:40.520539 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:40.520520 2582 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-135-222.ec2.internal\" not found" node="ip-10-0-135-222.ec2.internal"
Apr 24 22:29:40.521880 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.521868 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:40.524829 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.524811 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-222.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:40.524918 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.524843 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-222.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:40.524918 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.524855 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-222.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:40.525397 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.525384 2582 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 24 22:29:40.525397 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.525396 2582 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 24 22:29:40.525519 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.525413 2582 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 22:29:40.527613 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.527595 2582 policy_none.go:49] "None policy: Start"
Apr 24 22:29:40.527613 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.527614 2582 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 24 22:29:40.527722 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.527633 2582 state_mem.go:35] "Initializing new in-memory state store"
Apr 24 22:29:40.566753 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.566724 2582 manager.go:341] "Starting Device Plugin manager"
Apr 24 22:29:40.576158 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:40.566769 2582 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 22:29:40.576158 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.566784 2582 server.go:85] "Starting device plugin registration server"
Apr 24 22:29:40.576158 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.567115 2582 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 24 22:29:40.576158 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.567130 2582 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 22:29:40.576158 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.567221 2582 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 24 22:29:40.576158 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.567306 2582 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 24 22:29:40.576158 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.567315 2582 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 24 22:29:40.576158 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:40.567935 2582 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 24 22:29:40.576158 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:40.567969 2582 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-222.ec2.internal\" not found"
Apr 24 22:29:40.630194 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.630108 2582 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 24 22:29:40.631510 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.631492 2582 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 24 22:29:40.631586 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.631522 2582 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 24 22:29:40.631586 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.631545 2582 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 24 22:29:40.631586 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.631554 2582 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 24 22:29:40.631724 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:40.631596 2582 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 24 22:29:40.635135 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.635108 2582 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:40.667831 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.667788 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:40.668870 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.668848 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-222.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:40.668972 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.668884 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-222.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:40.668972 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.668895 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-222.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:40.668972 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.668927 2582 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-222.ec2.internal"
Apr 24 22:29:40.677622 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.677602 2582 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-222.ec2.internal"
Apr 24 22:29:40.677729 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:40.677631 2582 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-222.ec2.internal\": node \"ip-10-0-135-222.ec2.internal\" not found"
Apr 24 22:29:40.708871 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:40.708844 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-222.ec2.internal\" not found"
Apr 24 22:29:40.731714 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.731686 2582 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-222.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-222.ec2.internal"]
Apr 24 22:29:40.731846 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.731764 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:40.732753 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.732735 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-222.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:40.732837 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.732769 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-222.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:40.732837 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.732779 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-222.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:40.734052 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.734038 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:40.734229 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.734212 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-222.ec2.internal"
Apr 24 22:29:40.734287 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.734249 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:40.734953 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.734937 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-222.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:40.735047 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.734953 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-222.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:40.735047 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.734964 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-222.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:40.735047 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.734974 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-222.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:40.735047 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.734978 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-222.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:40.735047 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.735000 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-222.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:40.736100 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.736082 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-222.ec2.internal"
Apr 24 22:29:40.736184 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.736113 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:40.736801 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.736786 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-222.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:40.736873 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.736827 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-222.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:40.736873 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.736842 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-222.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:40.760905 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:40.760879 2582 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-222.ec2.internal\" not found" node="ip-10-0-135-222.ec2.internal"
Apr 24 22:29:40.765337 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:40.765316 2582 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-222.ec2.internal\" not found" node="ip-10-0-135-222.ec2.internal"
Apr 24 22:29:40.807418 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.807388 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/404bdfcf13a36b9e1c667a45ad6916e1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-222.ec2.internal\" (UID: \"404bdfcf13a36b9e1c667a45ad6916e1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-222.ec2.internal"
Apr 24 22:29:40.807418 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.807419 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/404bdfcf13a36b9e1c667a45ad6916e1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-222.ec2.internal\" (UID: \"404bdfcf13a36b9e1c667a45ad6916e1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-222.ec2.internal"
Apr 24 22:29:40.807552 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.807441 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b28c4947c642c3a6ca28e65c847e2583-config\") pod \"kube-apiserver-proxy-ip-10-0-135-222.ec2.internal\" (UID: \"b28c4947c642c3a6ca28e65c847e2583\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-222.ec2.internal"
Apr 24 22:29:40.809905 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:40.809888 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-222.ec2.internal\" not found"
Apr 24 22:29:40.908087 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.908000 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/404bdfcf13a36b9e1c667a45ad6916e1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-222.ec2.internal\" (UID: \"404bdfcf13a36b9e1c667a45ad6916e1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-222.ec2.internal"
Apr 24 22:29:40.908087 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.908040 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/404bdfcf13a36b9e1c667a45ad6916e1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-222.ec2.internal\" (UID: \"404bdfcf13a36b9e1c667a45ad6916e1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-222.ec2.internal"
Apr 24 22:29:40.908087 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.908074 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b28c4947c642c3a6ca28e65c847e2583-config\") pod \"kube-apiserver-proxy-ip-10-0-135-222.ec2.internal\" (UID: \"b28c4947c642c3a6ca28e65c847e2583\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-222.ec2.internal"
Apr 24 22:29:40.908259 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.908121 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/404bdfcf13a36b9e1c667a45ad6916e1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-222.ec2.internal\" (UID: \"404bdfcf13a36b9e1c667a45ad6916e1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-222.ec2.internal"
Apr 24 22:29:40.908259 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.908132 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/404bdfcf13a36b9e1c667a45ad6916e1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-222.ec2.internal\" (UID: \"404bdfcf13a36b9e1c667a45ad6916e1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-222.ec2.internal"
Apr 24 22:29:40.908259 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:40.908167 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b28c4947c642c3a6ca28e65c847e2583-config\") pod \"kube-apiserver-proxy-ip-10-0-135-222.ec2.internal\" (UID: \"b28c4947c642c3a6ca28e65c847e2583\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-222.ec2.internal"
Apr 24 22:29:40.910082 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:40.910051 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-222.ec2.internal\" not found"
Apr 24 22:29:41.010915 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:41.010884 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-222.ec2.internal\" not found"
Apr 24 22:29:41.063131 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:41.063098 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-222.ec2.internal"
Apr 24 22:29:41.067718 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:41.067695 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-222.ec2.internal"
Apr 24 22:29:41.111551 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:41.111521 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-222.ec2.internal\" not found"
Apr 24 22:29:41.212127 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:41.212038 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-222.ec2.internal\" not found"
Apr 24 22:29:41.308233 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:41.308201 2582 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:41.312550 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:41.312524 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-222.ec2.internal\" not found"
Apr 24 22:29:41.404886 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:41.404854 2582 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 22:29:41.405555 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:41.405017 2582 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 22:29:41.405555 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:41.405069 2582 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 22:29:41.405555 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:41.405079 2582 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 22:29:41.413265 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:41.413241 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-222.ec2.internal\" not found"
Apr 24 22:29:41.502014 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:41.501924 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 22:24:40 +0000 UTC" deadline="2027-11-01 19:58:58.411985897 +0000 UTC"
Apr 24 22:29:41.502014 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:41.501969 2582 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13341h29m16.910020648s"
Apr 24 22:29:41.506140 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:41.506115 2582 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 22:29:41.513312 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:41.513286 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-222.ec2.internal\" not found"
Apr 24 22:29:41.516998 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:41.516973 2582 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 22:29:41.542122 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:41.542097 2582 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-hzjz5"
Apr 24 22:29:41.550228 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:41.550203 2582 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-hzjz5"
Apr 24 22:29:41.551545 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:41.551512 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod404bdfcf13a36b9e1c667a45ad6916e1.slice/crio-b616034de7ab0a350de5f83b14e2eaa928403b9bf91b19e23a008ab802e26d5e WatchSource:0}: Error finding container b616034de7ab0a350de5f83b14e2eaa928403b9bf91b19e23a008ab802e26d5e: Status 404 returned error can't find the container with id b616034de7ab0a350de5f83b14e2eaa928403b9bf91b19e23a008ab802e26d5e
Apr 24 22:29:41.551773 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:41.551757 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb28c4947c642c3a6ca28e65c847e2583.slice/crio-0cbd8403393231e21ebe90a65476c5b8cfe1375ea61e9b668c745f6004ee34a2 WatchSource:0}: Error finding container 0cbd8403393231e21ebe90a65476c5b8cfe1375ea61e9b668c745f6004ee34a2: Status 404 returned error can't find the container with id 0cbd8403393231e21ebe90a65476c5b8cfe1375ea61e9b668c745f6004ee34a2
Apr 24 22:29:41.555930 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:41.555908 2582 provider.go:93] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Apr 24 22:29:41.614377 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:41.614340 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-222.ec2.internal\" not found" Apr 24 22:29:41.634368 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:41.634307 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-222.ec2.internal" event={"ID":"404bdfcf13a36b9e1c667a45ad6916e1","Type":"ContainerStarted","Data":"b616034de7ab0a350de5f83b14e2eaa928403b9bf91b19e23a008ab802e26d5e"} Apr 24 22:29:41.635246 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:41.635225 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-222.ec2.internal" event={"ID":"b28c4947c642c3a6ca28e65c847e2583","Type":"ContainerStarted","Data":"0cbd8403393231e21ebe90a65476c5b8cfe1375ea61e9b668c745f6004ee34a2"} Apr 24 22:29:41.714806 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:41.714757 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-222.ec2.internal\" not found" Apr 24 22:29:41.815237 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:41.815205 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-222.ec2.internal\" not found" Apr 24 22:29:41.915710 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:41.915667 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-222.ec2.internal\" not found" Apr 24 22:29:42.011693 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.011510 2582 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:29:42.107281 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.106792 2582 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-222.ec2.internal" Apr 24 22:29:42.119700 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.119671 2582 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 22:29:42.120653 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.120626 2582 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-222.ec2.internal" Apr 24 22:29:42.128992 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.128964 2582 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 22:29:42.477176 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.477089 2582 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:29:42.482604 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.482580 2582 apiserver.go:52] "Watching apiserver" Apr 24 22:29:42.487729 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.487705 2582 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 22:29:42.489903 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.489878 2582 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-222.ec2.internal","openshift-multus/network-metrics-daemon-hgvbb","openshift-network-diagnostics/network-check-target-7sw9z","openshift-ovn-kubernetes/ovnkube-node-dzsgr","kube-system/konnectivity-agent-kg2xf","kube-system/kube-apiserver-proxy-ip-10-0-135-222.ec2.internal","openshift-image-registry/node-ca-j4t6h","openshift-multus/multus-9wphl","openshift-multus/multus-additional-cni-plugins-z9q4w","openshift-network-operator/iptables-alerter-pqn8r","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk","openshift-cluster-node-tuning-operator/tuned-7pchr","openshift-dns/node-resolver-wtlmx"] Apr 24 22:29:42.492073 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.492037 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j4t6h" Apr 24 22:29:42.493168 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.493150 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgvbb" Apr 24 22:29:42.493264 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:42.493214 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgvbb" podUID="171d0bdf-1d87-4aee-9fad-9c28075596bd" Apr 24 22:29:42.493998 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.493976 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 22:29:42.494140 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.494017 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 22:29:42.494262 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.494248 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-qlvgm\"" Apr 24 22:29:42.494318 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.494283 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 22:29:42.494556 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.494540 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7sw9z" Apr 24 22:29:42.494623 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:42.494605 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7sw9z" podUID="b5d8eefa-153f-46d6-8848-82778399a098" Apr 24 22:29:42.495825 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.495806 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.497029 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.497011 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-kg2xf" Apr 24 22:29:42.497343 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.497319 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 22:29:42.497521 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.497508 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 22:29:42.497690 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.497676 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 22:29:42.497772 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.497696 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-8lpxp\"" Apr 24 22:29:42.497971 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.497951 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 22:29:42.498357 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.498261 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.498974 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.498956 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 22:29:42.499175 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.499028 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 22:29:42.499175 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.499157 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 22:29:42.499331 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.499294 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-582p9\"" Apr 24 22:29:42.500189 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.499540 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 22:29:42.500189 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.499592 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-z9q4w" Apr 24 22:29:42.500189 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.499989 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 22:29:42.500189 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.500049 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 22:29:42.500189 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.500107 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 22:29:42.500189 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.500138 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-wdlhb\"" Apr 24 22:29:42.500511 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.500234 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 22:29:42.501302 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.500860 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-pqn8r" Apr 24 22:29:42.502371 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.502351 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk" Apr 24 22:29:42.504095 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.504052 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 22:29:42.504270 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.504247 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-tlrrk\"" Apr 24 22:29:42.505261 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.504442 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 22:29:42.505261 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.504856 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 22:29:42.505261 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.504996 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 22:29:42.505261 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.504996 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-29drs\"" Apr 24 22:29:42.505261 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.505124 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 22:29:42.505261 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.505139 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 22:29:42.505552 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.505378 2582 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 22:29:42.505652 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.505635 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-j87cq\"" Apr 24 22:29:42.506787 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.505926 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 22:29:42.507328 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.507309 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.509279 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.509255 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 22:29:42.509387 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.509260 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 22:29:42.510450 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.510022 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-wtlmx" Apr 24 22:29:42.510450 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.510447 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-pqpxh\"" Apr 24 22:29:42.512236 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.512210 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 22:29:42.512337 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.512318 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-bcxh4\"" Apr 24 22:29:42.512392 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.512382 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 22:29:42.516375 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.516353 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-host-cni-bin\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.516478 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.516391 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zt5v\" (UniqueName: \"kubernetes.io/projected/239c26d8-bd64-4f99-9455-4fceceb609ee-kube-api-access-4zt5v\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.516478 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.516420 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dvk5\" (UniqueName: 
\"kubernetes.io/projected/7cf45f35-3263-4dd4-83bf-caaac71acebd-kube-api-access-2dvk5\") pod \"iptables-alerter-pqn8r\" (UID: \"7cf45f35-3263-4dd4-83bf-caaac71acebd\") " pod="openshift-network-operator/iptables-alerter-pqn8r" Apr 24 22:29:42.516478 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.516436 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78z27\" (UniqueName: \"kubernetes.io/projected/171d0bdf-1d87-4aee-9fad-9c28075596bd-kube-api-access-78z27\") pod \"network-metrics-daemon-hgvbb\" (UID: \"171d0bdf-1d87-4aee-9fad-9c28075596bd\") " pod="openshift-multus/network-metrics-daemon-hgvbb" Apr 24 22:29:42.516478 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.516451 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-var-lib-openvswitch\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.516478 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.516466 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-multus-daemon-config\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.516695 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.516489 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9de1a8ab-60ad-4326-9262-cd5c6afff9fc-device-dir\") pod \"aws-ebs-csi-driver-node-8g4nk\" (UID: \"9de1a8ab-60ad-4326-9262-cd5c6afff9fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk" Apr 24 22:29:42.516695 ip-10-0-135-222 
kubenswrapper[2582]: I0424 22:29:42.516511 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-etc-sysctl-conf\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.516695 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.516545 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-host\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.516695 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.516573 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/239c26d8-bd64-4f99-9455-4fceceb609ee-env-overrides\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.516695 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.516602 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/44759cca-eeb7-4b34-af4d-65cef31d60a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z9q4w\" (UID: \"44759cca-eeb7-4b34-af4d-65cef31d60a1\") " pod="openshift-multus/multus-additional-cni-plugins-z9q4w" Apr 24 22:29:42.516695 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.516626 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/44759cca-eeb7-4b34-af4d-65cef31d60a1-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-z9q4w\" (UID: \"44759cca-eeb7-4b34-af4d-65cef31d60a1\") " pod="openshift-multus/multus-additional-cni-plugins-z9q4w" Apr 24 22:29:42.516695 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.516650 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-multus-cni-dir\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.516695 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.516672 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-cni-binary-copy\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.516979 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.516692 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9de1a8ab-60ad-4326-9262-cd5c6afff9fc-registration-dir\") pod \"aws-ebs-csi-driver-node-8g4nk\" (UID: \"9de1a8ab-60ad-4326-9262-cd5c6afff9fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk" Apr 24 22:29:42.516979 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.516723 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-node-log\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.516979 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.516752 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-host-slash\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.516979 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.516785 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-host-run-netns\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.516979 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.516823 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7cf45f35-3263-4dd4-83bf-caaac71acebd-iptables-alerter-script\") pod \"iptables-alerter-pqn8r\" (UID: \"7cf45f35-3263-4dd4-83bf-caaac71acebd\") " pod="openshift-network-operator/iptables-alerter-pqn8r" Apr 24 22:29:42.516979 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.516855 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.516979 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.516891 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-var-lib-kubelet\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " 
pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.517319 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517029 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/44759cca-eeb7-4b34-af4d-65cef31d60a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-z9q4w\" (UID: \"44759cca-eeb7-4b34-af4d-65cef31d60a1\") " pod="openshift-multus/multus-additional-cni-plugins-z9q4w" Apr 24 22:29:42.517319 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517090 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-multus-conf-dir\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.517319 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517134 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f4851636-e409-4338-9170-49d3547b7af4-etc-tuned\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.517319 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517189 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/239c26d8-bd64-4f99-9455-4fceceb609ee-ovnkube-script-lib\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.517319 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517224 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8w2b\" (UniqueName: 
\"kubernetes.io/projected/4f665f36-3e6e-4199-bbcd-df474abfeb86-kube-api-access-w8w2b\") pod \"node-ca-j4t6h\" (UID: \"4f665f36-3e6e-4199-bbcd-df474abfeb86\") " pod="openshift-image-registry/node-ca-j4t6h" Apr 24 22:29:42.517319 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517255 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl4s7\" (UniqueName: \"kubernetes.io/projected/9de1a8ab-60ad-4326-9262-cd5c6afff9fc-kube-api-access-hl4s7\") pod \"aws-ebs-csi-driver-node-8g4nk\" (UID: \"9de1a8ab-60ad-4326-9262-cd5c6afff9fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk" Apr 24 22:29:42.517319 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517280 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-etc-modprobe-d\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.517319 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517307 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt28x\" (UniqueName: \"kubernetes.io/projected/f4851636-e409-4338-9170-49d3547b7af4-kube-api-access-tt28x\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.517679 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517330 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-host-kubelet\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.517679 ip-10-0-135-222 
kubenswrapper[2582]: I0424 22:29:42.517355 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-host-cni-netd\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.517679 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517379 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3c9ae31a-f5e8-444b-8692-a6e8b24d04ad-agent-certs\") pod \"konnectivity-agent-kg2xf\" (UID: \"3c9ae31a-f5e8-444b-8692-a6e8b24d04ad\") " pod="kube-system/konnectivity-agent-kg2xf" Apr 24 22:29:42.517679 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517402 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-etc-kubernetes\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.517679 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517435 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-cnibin\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.517679 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517478 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-host-var-lib-cni-bin\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " 
pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.517679 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517505 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-host-run-multus-certs\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.517679 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517529 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9de1a8ab-60ad-4326-9262-cd5c6afff9fc-socket-dir\") pod \"aws-ebs-csi-driver-node-8g4nk\" (UID: \"9de1a8ab-60ad-4326-9262-cd5c6afff9fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk" Apr 24 22:29:42.517679 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517551 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-sys\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.517679 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517577 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-host-var-lib-kubelet\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.517679 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517603 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vznvq\" (UniqueName: 
\"kubernetes.io/projected/b5d8eefa-153f-46d6-8848-82778399a098-kube-api-access-vznvq\") pod \"network-check-target-7sw9z\" (UID: \"b5d8eefa-153f-46d6-8848-82778399a098\") " pod="openshift-network-diagnostics/network-check-target-7sw9z" Apr 24 22:29:42.517679 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517649 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-log-socket\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.518227 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517681 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/44759cca-eeb7-4b34-af4d-65cef31d60a1-os-release\") pod \"multus-additional-cni-plugins-z9q4w\" (UID: \"44759cca-eeb7-4b34-af4d-65cef31d60a1\") " pod="openshift-multus/multus-additional-cni-plugins-z9q4w" Apr 24 22:29:42.518227 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517706 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-etc-sysctl-d\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.518227 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517724 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-etc-systemd\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.518227 ip-10-0-135-222 kubenswrapper[2582]: I0424 
22:29:42.517739 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-etc-openvswitch\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.518227 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517771 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-host-run-ovn-kubernetes\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.518227 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517802 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/44759cca-eeb7-4b34-af4d-65cef31d60a1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-z9q4w\" (UID: \"44759cca-eeb7-4b34-af4d-65cef31d60a1\") " pod="openshift-multus/multus-additional-cni-plugins-z9q4w" Apr 24 22:29:42.518227 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517828 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7cf45f35-3263-4dd4-83bf-caaac71acebd-host-slash\") pod \"iptables-alerter-pqn8r\" (UID: \"7cf45f35-3263-4dd4-83bf-caaac71acebd\") " pod="openshift-network-operator/iptables-alerter-pqn8r" Apr 24 22:29:42.518227 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517854 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4f665f36-3e6e-4199-bbcd-df474abfeb86-serviceca\") pod 
\"node-ca-j4t6h\" (UID: \"4f665f36-3e6e-4199-bbcd-df474abfeb86\") " pod="openshift-image-registry/node-ca-j4t6h" Apr 24 22:29:42.518227 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517887 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3c9ae31a-f5e8-444b-8692-a6e8b24d04ad-konnectivity-ca\") pod \"konnectivity-agent-kg2xf\" (UID: \"3c9ae31a-f5e8-444b-8692-a6e8b24d04ad\") " pod="kube-system/konnectivity-agent-kg2xf" Apr 24 22:29:42.518227 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517912 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-host-run-netns\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.518227 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517949 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44759cca-eeb7-4b34-af4d-65cef31d60a1-system-cni-dir\") pod \"multus-additional-cni-plugins-z9q4w\" (UID: \"44759cca-eeb7-4b34-af4d-65cef31d60a1\") " pod="openshift-multus/multus-additional-cni-plugins-z9q4w" Apr 24 22:29:42.518227 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517974 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ltpj\" (UniqueName: \"kubernetes.io/projected/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-kube-api-access-9ltpj\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.518227 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.517997 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" 
(UniqueName: \"kubernetes.io/host-path/9de1a8ab-60ad-4326-9262-cd5c6afff9fc-etc-selinux\") pod \"aws-ebs-csi-driver-node-8g4nk\" (UID: \"9de1a8ab-60ad-4326-9262-cd5c6afff9fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk" Apr 24 22:29:42.518227 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.518038 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9de1a8ab-60ad-4326-9262-cd5c6afff9fc-sys-fs\") pod \"aws-ebs-csi-driver-node-8g4nk\" (UID: \"9de1a8ab-60ad-4326-9262-cd5c6afff9fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk" Apr 24 22:29:42.518227 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.518085 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-etc-sysconfig\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.518227 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.518120 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f4851636-e409-4338-9170-49d3547b7af4-tmp\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.518861 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.518143 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-systemd-units\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.518861 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.518169 
2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/239c26d8-bd64-4f99-9455-4fceceb609ee-ovnkube-config\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.518861 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.518192 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t748r\" (UniqueName: \"kubernetes.io/projected/44759cca-eeb7-4b34-af4d-65cef31d60a1-kube-api-access-t748r\") pod \"multus-additional-cni-plugins-z9q4w\" (UID: \"44759cca-eeb7-4b34-af4d-65cef31d60a1\") " pod="openshift-multus/multus-additional-cni-plugins-z9q4w" Apr 24 22:29:42.518861 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.518216 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-system-cni-dir\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.518861 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.518251 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-hostroot\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.518861 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.518276 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-etc-kubernetes\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 
24 22:29:42.518861 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.518297 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-lib-modules\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.518861 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.518322 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f665f36-3e6e-4199-bbcd-df474abfeb86-host\") pod \"node-ca-j4t6h\" (UID: \"4f665f36-3e6e-4199-bbcd-df474abfeb86\") " pod="openshift-image-registry/node-ca-j4t6h" Apr 24 22:29:42.518861 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.518345 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-multus-socket-dir-parent\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.518861 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.518374 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-host-run-k8s-cni-cncf-io\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.518861 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.518402 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9de1a8ab-60ad-4326-9262-cd5c6afff9fc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8g4nk\" (UID: 
\"9de1a8ab-60ad-4326-9262-cd5c6afff9fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk" Apr 24 22:29:42.518861 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.518424 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-run\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.518861 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.518483 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/239c26d8-bd64-4f99-9455-4fceceb609ee-ovn-node-metrics-cert\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.518861 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.518524 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/44759cca-eeb7-4b34-af4d-65cef31d60a1-cnibin\") pod \"multus-additional-cni-plugins-z9q4w\" (UID: \"44759cca-eeb7-4b34-af4d-65cef31d60a1\") " pod="openshift-multus/multus-additional-cni-plugins-z9q4w" Apr 24 22:29:42.518861 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.518549 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-os-release\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.518861 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.518575 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-host-var-lib-cni-multus\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.518861 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.518608 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/171d0bdf-1d87-4aee-9fad-9c28075596bd-metrics-certs\") pod \"network-metrics-daemon-hgvbb\" (UID: \"171d0bdf-1d87-4aee-9fad-9c28075596bd\") " pod="openshift-multus/network-metrics-daemon-hgvbb" Apr 24 22:29:42.519554 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.518645 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-run-systemd\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.519554 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.518671 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-run-openvswitch\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.519554 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.518717 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-run-ovn\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.551209 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.551172 2582 certificate_manager.go:715] 
"Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 22:24:41 +0000 UTC" deadline="2028-01-13 14:02:14.488593858 +0000 UTC" Apr 24 22:29:42.551209 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.551205 2582 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15087h32m31.937391553s" Apr 24 22:29:42.607329 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.607297 2582 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 22:29:42.618920 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.618862 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qrs6\" (UniqueName: \"kubernetes.io/projected/722d1910-3c2f-4e70-af24-daf9f78fcf06-kube-api-access-7qrs6\") pod \"node-resolver-wtlmx\" (UID: \"722d1910-3c2f-4e70-af24-daf9f78fcf06\") " pod="openshift-dns/node-resolver-wtlmx" Apr 24 22:29:42.618920 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.618918 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-etc-openvswitch\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.618920 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.618945 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-host-run-ovn-kubernetes\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.619258 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.618970 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/44759cca-eeb7-4b34-af4d-65cef31d60a1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-z9q4w\" (UID: \"44759cca-eeb7-4b34-af4d-65cef31d60a1\") " pod="openshift-multus/multus-additional-cni-plugins-z9q4w" Apr 24 22:29:42.619258 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.618989 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-etc-openvswitch\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.619258 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619011 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7cf45f35-3263-4dd4-83bf-caaac71acebd-host-slash\") pod \"iptables-alerter-pqn8r\" (UID: \"7cf45f35-3263-4dd4-83bf-caaac71acebd\") " pod="openshift-network-operator/iptables-alerter-pqn8r" Apr 24 22:29:42.619258 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619052 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4f665f36-3e6e-4199-bbcd-df474abfeb86-serviceca\") pod \"node-ca-j4t6h\" (UID: \"4f665f36-3e6e-4199-bbcd-df474abfeb86\") " pod="openshift-image-registry/node-ca-j4t6h" Apr 24 22:29:42.619258 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619087 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-host-run-ovn-kubernetes\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.619258 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619097 2582 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3c9ae31a-f5e8-444b-8692-a6e8b24d04ad-konnectivity-ca\") pod \"konnectivity-agent-kg2xf\" (UID: \"3c9ae31a-f5e8-444b-8692-a6e8b24d04ad\") " pod="kube-system/konnectivity-agent-kg2xf" Apr 24 22:29:42.619258 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619050 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7cf45f35-3263-4dd4-83bf-caaac71acebd-host-slash\") pod \"iptables-alerter-pqn8r\" (UID: \"7cf45f35-3263-4dd4-83bf-caaac71acebd\") " pod="openshift-network-operator/iptables-alerter-pqn8r" Apr 24 22:29:42.619258 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619151 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-host-run-netns\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.619258 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619199 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-host-run-netns\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.619258 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619236 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44759cca-eeb7-4b34-af4d-65cef31d60a1-system-cni-dir\") pod \"multus-additional-cni-plugins-z9q4w\" (UID: \"44759cca-eeb7-4b34-af4d-65cef31d60a1\") " pod="openshift-multus/multus-additional-cni-plugins-z9q4w" Apr 24 22:29:42.619684 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619266 2582 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ltpj\" (UniqueName: \"kubernetes.io/projected/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-kube-api-access-9ltpj\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.619684 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619290 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9de1a8ab-60ad-4326-9262-cd5c6afff9fc-etc-selinux\") pod \"aws-ebs-csi-driver-node-8g4nk\" (UID: \"9de1a8ab-60ad-4326-9262-cd5c6afff9fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk" Apr 24 22:29:42.619684 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619314 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9de1a8ab-60ad-4326-9262-cd5c6afff9fc-sys-fs\") pod \"aws-ebs-csi-driver-node-8g4nk\" (UID: \"9de1a8ab-60ad-4326-9262-cd5c6afff9fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk" Apr 24 22:29:42.619684 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619326 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44759cca-eeb7-4b34-af4d-65cef31d60a1-system-cni-dir\") pod \"multus-additional-cni-plugins-z9q4w\" (UID: \"44759cca-eeb7-4b34-af4d-65cef31d60a1\") " pod="openshift-multus/multus-additional-cni-plugins-z9q4w" Apr 24 22:29:42.619684 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619336 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-etc-sysconfig\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.619684 
ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619377 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-etc-sysconfig\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.619684 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619385 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f4851636-e409-4338-9170-49d3547b7af4-tmp\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.619684 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619419 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/722d1910-3c2f-4e70-af24-daf9f78fcf06-hosts-file\") pod \"node-resolver-wtlmx\" (UID: \"722d1910-3c2f-4e70-af24-daf9f78fcf06\") " pod="openshift-dns/node-resolver-wtlmx" Apr 24 22:29:42.619684 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619426 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9de1a8ab-60ad-4326-9262-cd5c6afff9fc-etc-selinux\") pod \"aws-ebs-csi-driver-node-8g4nk\" (UID: \"9de1a8ab-60ad-4326-9262-cd5c6afff9fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk" Apr 24 22:29:42.619684 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619452 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-systemd-units\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.619684 
ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619476 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/239c26d8-bd64-4f99-9455-4fceceb609ee-ovnkube-config\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.619684 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619512 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t748r\" (UniqueName: \"kubernetes.io/projected/44759cca-eeb7-4b34-af4d-65cef31d60a1-kube-api-access-t748r\") pod \"multus-additional-cni-plugins-z9q4w\" (UID: \"44759cca-eeb7-4b34-af4d-65cef31d60a1\") " pod="openshift-multus/multus-additional-cni-plugins-z9q4w" Apr 24 22:29:42.619684 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619538 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-system-cni-dir\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.619684 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619542 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4f665f36-3e6e-4199-bbcd-df474abfeb86-serviceca\") pod \"node-ca-j4t6h\" (UID: \"4f665f36-3e6e-4199-bbcd-df474abfeb86\") " pod="openshift-image-registry/node-ca-j4t6h" Apr 24 22:29:42.619684 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619561 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-hostroot\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.619684 ip-10-0-135-222 
kubenswrapper[2582]: I0424 22:29:42.619585 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-etc-kubernetes\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.619684 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619601 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-systemd-units\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.620544 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619610 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-lib-modules\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.620544 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619642 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f665f36-3e6e-4199-bbcd-df474abfeb86-host\") pod \"node-ca-j4t6h\" (UID: \"4f665f36-3e6e-4199-bbcd-df474abfeb86\") " pod="openshift-image-registry/node-ca-j4t6h" Apr 24 22:29:42.620544 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619649 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9de1a8ab-60ad-4326-9262-cd5c6afff9fc-sys-fs\") pod \"aws-ebs-csi-driver-node-8g4nk\" (UID: \"9de1a8ab-60ad-4326-9262-cd5c6afff9fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk" Apr 24 22:29:42.620544 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619652 2582 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3c9ae31a-f5e8-444b-8692-a6e8b24d04ad-konnectivity-ca\") pod \"konnectivity-agent-kg2xf\" (UID: \"3c9ae31a-f5e8-444b-8692-a6e8b24d04ad\") " pod="kube-system/konnectivity-agent-kg2xf" Apr 24 22:29:42.620544 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619667 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-system-cni-dir\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.620544 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619669 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/44759cca-eeb7-4b34-af4d-65cef31d60a1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-z9q4w\" (UID: \"44759cca-eeb7-4b34-af4d-65cef31d60a1\") " pod="openshift-multus/multus-additional-cni-plugins-z9q4w" Apr 24 22:29:42.620544 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619667 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-multus-socket-dir-parent\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.620544 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619709 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-multus-socket-dir-parent\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.620544 ip-10-0-135-222 kubenswrapper[2582]: 
I0424 22:29:42.619712 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-host-run-k8s-cni-cncf-io\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.620544 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619711 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-hostroot\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.620544 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619720 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f665f36-3e6e-4199-bbcd-df474abfeb86-host\") pod \"node-ca-j4t6h\" (UID: \"4f665f36-3e6e-4199-bbcd-df474abfeb86\") " pod="openshift-image-registry/node-ca-j4t6h" Apr 24 22:29:42.620544 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619725 2582 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 22:29:42.620544 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619744 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9de1a8ab-60ad-4326-9262-cd5c6afff9fc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8g4nk\" (UID: \"9de1a8ab-60ad-4326-9262-cd5c6afff9fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk" Apr 24 22:29:42.620544 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619754 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-host-run-k8s-cni-cncf-io\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.620544 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619767 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-run\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.620544 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619746 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-etc-kubernetes\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.620544 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619792 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/239c26d8-bd64-4f99-9455-4fceceb609ee-ovn-node-metrics-cert\") pod 
\"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.620544 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619802 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9de1a8ab-60ad-4326-9262-cd5c6afff9fc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8g4nk\" (UID: \"9de1a8ab-60ad-4326-9262-cd5c6afff9fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk" Apr 24 22:29:42.621045 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619813 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/44759cca-eeb7-4b34-af4d-65cef31d60a1-cnibin\") pod \"multus-additional-cni-plugins-z9q4w\" (UID: \"44759cca-eeb7-4b34-af4d-65cef31d60a1\") " pod="openshift-multus/multus-additional-cni-plugins-z9q4w" Apr 24 22:29:42.621045 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619826 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-run\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.621045 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619849 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/44759cca-eeb7-4b34-af4d-65cef31d60a1-cnibin\") pod \"multus-additional-cni-plugins-z9q4w\" (UID: \"44759cca-eeb7-4b34-af4d-65cef31d60a1\") " pod="openshift-multus/multus-additional-cni-plugins-z9q4w" Apr 24 22:29:42.621045 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619852 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-lib-modules\") pod 
\"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.621045 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619891 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-os-release\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.621045 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619920 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-host-var-lib-cni-multus\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.621045 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619956 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/171d0bdf-1d87-4aee-9fad-9c28075596bd-metrics-certs\") pod \"network-metrics-daemon-hgvbb\" (UID: \"171d0bdf-1d87-4aee-9fad-9c28075596bd\") " pod="openshift-multus/network-metrics-daemon-hgvbb" Apr 24 22:29:42.621045 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619974 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-os-release\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.621045 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.619992 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-run-systemd\") pod \"ovnkube-node-dzsgr\" (UID: 
\"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.621045 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620014 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-run-openvswitch\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.621045 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620022 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-host-var-lib-cni-multus\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.621045 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620038 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-run-ovn\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.621045 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620044 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-run-systemd\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.621045 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620081 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-host-cni-bin\") pod \"ovnkube-node-dzsgr\" (UID: 
\"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.621045 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620107 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zt5v\" (UniqueName: \"kubernetes.io/projected/239c26d8-bd64-4f99-9455-4fceceb609ee-kube-api-access-4zt5v\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.621045 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620083 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-run-openvswitch\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.621045 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620119 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-host-cni-bin\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.621045 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:42.620124 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:42.621868 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620109 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-run-ovn\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.621868 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620131 2582 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2dvk5\" (UniqueName: \"kubernetes.io/projected/7cf45f35-3263-4dd4-83bf-caaac71acebd-kube-api-access-2dvk5\") pod \"iptables-alerter-pqn8r\" (UID: \"7cf45f35-3263-4dd4-83bf-caaac71acebd\") " pod="openshift-network-operator/iptables-alerter-pqn8r" Apr 24 22:29:42.621868 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620170 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78z27\" (UniqueName: \"kubernetes.io/projected/171d0bdf-1d87-4aee-9fad-9c28075596bd-kube-api-access-78z27\") pod \"network-metrics-daemon-hgvbb\" (UID: \"171d0bdf-1d87-4aee-9fad-9c28075596bd\") " pod="openshift-multus/network-metrics-daemon-hgvbb" Apr 24 22:29:42.621868 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620166 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/239c26d8-bd64-4f99-9455-4fceceb609ee-ovnkube-config\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.621868 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:42.620195 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/171d0bdf-1d87-4aee-9fad-9c28075596bd-metrics-certs podName:171d0bdf-1d87-4aee-9fad-9c28075596bd nodeName:}" failed. No retries permitted until 2026-04-24 22:29:43.120173509 +0000 UTC m=+3.063964832 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/171d0bdf-1d87-4aee-9fad-9c28075596bd-metrics-certs") pod "network-metrics-daemon-hgvbb" (UID: "171d0bdf-1d87-4aee-9fad-9c28075596bd") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:42.621868 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620224 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-var-lib-openvswitch\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.621868 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620248 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-multus-daemon-config\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.621868 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620272 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9de1a8ab-60ad-4326-9262-cd5c6afff9fc-device-dir\") pod \"aws-ebs-csi-driver-node-8g4nk\" (UID: \"9de1a8ab-60ad-4326-9262-cd5c6afff9fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk" Apr 24 22:29:42.621868 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620279 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-var-lib-openvswitch\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.621868 ip-10-0-135-222 kubenswrapper[2582]: I0424 
22:29:42.620300 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-etc-sysctl-conf\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.621868 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620337 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-host\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.621868 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620387 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/239c26d8-bd64-4f99-9455-4fceceb609ee-env-overrides\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.621868 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620421 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/44759cca-eeb7-4b34-af4d-65cef31d60a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z9q4w\" (UID: \"44759cca-eeb7-4b34-af4d-65cef31d60a1\") " pod="openshift-multus/multus-additional-cni-plugins-z9q4w" Apr 24 22:29:42.621868 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620427 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-etc-sysctl-conf\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.621868 ip-10-0-135-222 
kubenswrapper[2582]: I0424 22:29:42.620449 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/44759cca-eeb7-4b34-af4d-65cef31d60a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z9q4w\" (UID: \"44759cca-eeb7-4b34-af4d-65cef31d60a1\") " pod="openshift-multus/multus-additional-cni-plugins-z9q4w" Apr 24 22:29:42.621868 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620460 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-host\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.622682 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620467 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9de1a8ab-60ad-4326-9262-cd5c6afff9fc-device-dir\") pod \"aws-ebs-csi-driver-node-8g4nk\" (UID: \"9de1a8ab-60ad-4326-9262-cd5c6afff9fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk" Apr 24 22:29:42.622682 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620481 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-multus-cni-dir\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.622682 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620510 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-cni-binary-copy\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.622682 ip-10-0-135-222 
kubenswrapper[2582]: I0424 22:29:42.620532 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9de1a8ab-60ad-4326-9262-cd5c6afff9fc-registration-dir\") pod \"aws-ebs-csi-driver-node-8g4nk\" (UID: \"9de1a8ab-60ad-4326-9262-cd5c6afff9fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk" Apr 24 22:29:42.622682 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620557 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-node-log\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.622682 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620583 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-host-slash\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.622682 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620604 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-host-run-netns\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.622682 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620604 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/44759cca-eeb7-4b34-af4d-65cef31d60a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z9q4w\" (UID: \"44759cca-eeb7-4b34-af4d-65cef31d60a1\") " 
pod="openshift-multus/multus-additional-cni-plugins-z9q4w" Apr 24 22:29:42.622682 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620652 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9de1a8ab-60ad-4326-9262-cd5c6afff9fc-registration-dir\") pod \"aws-ebs-csi-driver-node-8g4nk\" (UID: \"9de1a8ab-60ad-4326-9262-cd5c6afff9fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk" Apr 24 22:29:42.622682 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620668 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-node-log\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.622682 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620682 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7cf45f35-3263-4dd4-83bf-caaac71acebd-iptables-alerter-script\") pod \"iptables-alerter-pqn8r\" (UID: \"7cf45f35-3263-4dd4-83bf-caaac71acebd\") " pod="openshift-network-operator/iptables-alerter-pqn8r" Apr 24 22:29:42.622682 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620712 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.622682 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620740 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-var-lib-kubelet\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.622682 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620750 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-host-run-netns\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.622682 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620767 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/722d1910-3c2f-4e70-af24-daf9f78fcf06-tmp-dir\") pod \"node-resolver-wtlmx\" (UID: \"722d1910-3c2f-4e70-af24-daf9f78fcf06\") " pod="openshift-dns/node-resolver-wtlmx" Apr 24 22:29:42.622682 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620793 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/239c26d8-bd64-4f99-9455-4fceceb609ee-env-overrides\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.622682 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620709 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-host-slash\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" Apr 24 22:29:42.623541 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620813 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-multus-cni-dir\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.623541 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620848 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-var-lib-kubelet\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.623541 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620795 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/44759cca-eeb7-4b34-af4d-65cef31d60a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-z9q4w\" (UID: \"44759cca-eeb7-4b34-af4d-65cef31d60a1\") " pod="openshift-multus/multus-additional-cni-plugins-z9q4w" Apr 24 22:29:42.623541 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620890 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-multus-conf-dir\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl" Apr 24 22:29:42.623541 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620917 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f4851636-e409-4338-9170-49d3547b7af4-etc-tuned\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr" Apr 24 22:29:42.623541 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620941 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/239c26d8-bd64-4f99-9455-4fceceb609ee-ovnkube-script-lib\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr"
Apr 24 22:29:42.623541 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620965 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8w2b\" (UniqueName: \"kubernetes.io/projected/4f665f36-3e6e-4199-bbcd-df474abfeb86-kube-api-access-w8w2b\") pod \"node-ca-j4t6h\" (UID: \"4f665f36-3e6e-4199-bbcd-df474abfeb86\") " pod="openshift-image-registry/node-ca-j4t6h"
Apr 24 22:29:42.623541 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.620988 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hl4s7\" (UniqueName: \"kubernetes.io/projected/9de1a8ab-60ad-4326-9262-cd5c6afff9fc-kube-api-access-hl4s7\") pod \"aws-ebs-csi-driver-node-8g4nk\" (UID: \"9de1a8ab-60ad-4326-9262-cd5c6afff9fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk"
Apr 24 22:29:42.623541 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621011 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-etc-modprobe-d\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr"
Apr 24 22:29:42.623541 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621037 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tt28x\" (UniqueName: \"kubernetes.io/projected/f4851636-e409-4338-9170-49d3547b7af4-kube-api-access-tt28x\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr"
Apr 24 22:29:42.623541 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621085 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-host-kubelet\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr"
Apr 24 22:29:42.623541 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621110 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-host-cni-netd\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr"
Apr 24 22:29:42.623541 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621133 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3c9ae31a-f5e8-444b-8692-a6e8b24d04ad-agent-certs\") pod \"konnectivity-agent-kg2xf\" (UID: \"3c9ae31a-f5e8-444b-8692-a6e8b24d04ad\") " pod="kube-system/konnectivity-agent-kg2xf"
Apr 24 22:29:42.623541 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621141 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/44759cca-eeb7-4b34-af4d-65cef31d60a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z9q4w\" (UID: \"44759cca-eeb7-4b34-af4d-65cef31d60a1\") " pod="openshift-multus/multus-additional-cni-plugins-z9q4w"
Apr 24 22:29:42.623541 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621159 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-etc-kubernetes\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr"
Apr 24 22:29:42.623541 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621184 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-cnibin\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl"
Apr 24 22:29:42.623541 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621209 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-host-var-lib-cni-bin\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl"
Apr 24 22:29:42.624315 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621220 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-host-kubelet\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr"
Apr 24 22:29:42.624315 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621235 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-host-run-multus-certs\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl"
Apr 24 22:29:42.624315 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621260 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9de1a8ab-60ad-4326-9262-cd5c6afff9fc-socket-dir\") pod \"aws-ebs-csi-driver-node-8g4nk\" (UID: \"9de1a8ab-60ad-4326-9262-cd5c6afff9fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk"
Apr 24 22:29:42.624315 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621274 2582
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-etc-kubernetes\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr"
Apr 24 22:29:42.624315 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621287 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-sys\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr"
Apr 24 22:29:42.624315 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621297 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-cni-binary-copy\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl"
Apr 24 22:29:42.624315 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621314 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-host-var-lib-kubelet\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl"
Apr 24 22:29:42.624315 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621324 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr"
Apr 24 22:29:42.624315 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621335 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-etc-modprobe-d\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr"
Apr 24 22:29:42.624315 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621340 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vznvq\" (UniqueName: \"kubernetes.io/projected/b5d8eefa-153f-46d6-8848-82778399a098-kube-api-access-vznvq\") pod \"network-check-target-7sw9z\" (UID: \"b5d8eefa-153f-46d6-8848-82778399a098\") " pod="openshift-network-diagnostics/network-check-target-7sw9z"
Apr 24 22:29:42.624315 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621286 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/44759cca-eeb7-4b34-af4d-65cef31d60a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-z9q4w\" (UID: \"44759cca-eeb7-4b34-af4d-65cef31d60a1\") " pod="openshift-multus/multus-additional-cni-plugins-z9q4w"
Apr 24 22:29:42.624315 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621381 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-multus-conf-dir\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl"
Apr 24 22:29:42.624315 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621405 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-cnibin\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl"
Apr 24 22:29:42.624315 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621408 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-host-var-lib-kubelet\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl"
Apr 24 22:29:42.624315 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621418 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-sys\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr"
Apr 24 22:29:42.624315 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621443 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-host-var-lib-cni-bin\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl"
Apr 24 22:29:42.624315 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621455 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-host-run-multus-certs\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl"
Apr 24 22:29:42.624315 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621456 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-log-socket\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr"
Apr 24 22:29:42.625184 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621494 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for
volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/44759cca-eeb7-4b34-af4d-65cef31d60a1-os-release\") pod \"multus-additional-cni-plugins-z9q4w\" (UID: \"44759cca-eeb7-4b34-af4d-65cef31d60a1\") " pod="openshift-multus/multus-additional-cni-plugins-z9q4w"
Apr 24 22:29:42.625184 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621527 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/44759cca-eeb7-4b34-af4d-65cef31d60a1-os-release\") pod \"multus-additional-cni-plugins-z9q4w\" (UID: \"44759cca-eeb7-4b34-af4d-65cef31d60a1\") " pod="openshift-multus/multus-additional-cni-plugins-z9q4w"
Apr 24 22:29:42.625184 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621565 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-etc-sysctl-d\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr"
Apr 24 22:29:42.625184 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621590 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-etc-systemd\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr"
Apr 24 22:29:42.625184 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621678 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-etc-systemd\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr"
Apr 24 22:29:42.625184 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621679 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7cf45f35-3263-4dd4-83bf-caaac71acebd-iptables-alerter-script\") pod \"iptables-alerter-pqn8r\" (UID: \"7cf45f35-3263-4dd4-83bf-caaac71acebd\") " pod="openshift-network-operator/iptables-alerter-pqn8r"
Apr 24 22:29:42.625184 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621701 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-log-socket\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr"
Apr 24 22:29:42.625184 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621757 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9de1a8ab-60ad-4326-9262-cd5c6afff9fc-socket-dir\") pod \"aws-ebs-csi-driver-node-8g4nk\" (UID: \"9de1a8ab-60ad-4326-9262-cd5c6afff9fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk"
Apr 24 22:29:42.625184 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621825 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f4851636-e409-4338-9170-49d3547b7af4-etc-sysctl-d\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr"
Apr 24 22:29:42.625184 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621851 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/239c26d8-bd64-4f99-9455-4fceceb609ee-host-cni-netd\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr"
Apr 24 22:29:42.625184 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.621905 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/239c26d8-bd64-4f99-9455-4fceceb609ee-ovnkube-script-lib\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr"
Apr 24 22:29:42.625184 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.622014 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-multus-daemon-config\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl"
Apr 24 22:29:42.625184 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.623556 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/239c26d8-bd64-4f99-9455-4fceceb609ee-ovn-node-metrics-cert\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr"
Apr 24 22:29:42.625184 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.624217 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f4851636-e409-4338-9170-49d3547b7af4-tmp\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr"
Apr 24 22:29:42.625184 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.624284 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f4851636-e409-4338-9170-49d3547b7af4-etc-tuned\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr"
Apr 24 22:29:42.628601 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.628543 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"kube-api-access-9ltpj\" (UniqueName: \"kubernetes.io/projected/d4b8fbfd-f18c-4b29-9c01-547311bd0ba6-kube-api-access-9ltpj\") pod \"multus-9wphl\" (UID: \"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6\") " pod="openshift-multus/multus-9wphl"
Apr 24 22:29:42.629673 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.629652 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3c9ae31a-f5e8-444b-8692-a6e8b24d04ad-agent-certs\") pod \"konnectivity-agent-kg2xf\" (UID: \"3c9ae31a-f5e8-444b-8692-a6e8b24d04ad\") " pod="kube-system/konnectivity-agent-kg2xf"
Apr 24 22:29:42.630394 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.630372 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t748r\" (UniqueName: \"kubernetes.io/projected/44759cca-eeb7-4b34-af4d-65cef31d60a1-kube-api-access-t748r\") pod \"multus-additional-cni-plugins-z9q4w\" (UID: \"44759cca-eeb7-4b34-af4d-65cef31d60a1\") " pod="openshift-multus/multus-additional-cni-plugins-z9q4w"
Apr 24 22:29:42.630668 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.630645 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zt5v\" (UniqueName: \"kubernetes.io/projected/239c26d8-bd64-4f99-9455-4fceceb609ee-kube-api-access-4zt5v\") pod \"ovnkube-node-dzsgr\" (UID: \"239c26d8-bd64-4f99-9455-4fceceb609ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr"
Apr 24 22:29:42.631314 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.631294 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dvk5\" (UniqueName: \"kubernetes.io/projected/7cf45f35-3263-4dd4-83bf-caaac71acebd-kube-api-access-2dvk5\") pod \"iptables-alerter-pqn8r\" (UID: \"7cf45f35-3263-4dd4-83bf-caaac71acebd\") " pod="openshift-network-operator/iptables-alerter-pqn8r"
Apr 24 22:29:42.633566 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.633543 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt28x\" (UniqueName: \"kubernetes.io/projected/f4851636-e409-4338-9170-49d3547b7af4-kube-api-access-tt28x\") pod \"tuned-7pchr\" (UID: \"f4851636-e409-4338-9170-49d3547b7af4\") " pod="openshift-cluster-node-tuning-operator/tuned-7pchr"
Apr 24 22:29:42.633676 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.633619 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8w2b\" (UniqueName: \"kubernetes.io/projected/4f665f36-3e6e-4199-bbcd-df474abfeb86-kube-api-access-w8w2b\") pod \"node-ca-j4t6h\" (UID: \"4f665f36-3e6e-4199-bbcd-df474abfeb86\") " pod="openshift-image-registry/node-ca-j4t6h"
Apr 24 22:29:42.634037 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:42.634022 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:29:42.634037 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:42.634040 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:29:42.634327 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:42.634052 2582 projected.go:194] Error preparing data for projected volume kube-api-access-vznvq for pod openshift-network-diagnostics/network-check-target-7sw9z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:42.634433 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:42.634399 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b5d8eefa-153f-46d6-8848-82778399a098-kube-api-access-vznvq podName:b5d8eefa-153f-46d6-8848-82778399a098 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:43.134366248 +0000 UTC m=+3.078157580 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-vznvq" (UniqueName: "kubernetes.io/projected/b5d8eefa-153f-46d6-8848-82778399a098-kube-api-access-vznvq") pod "network-check-target-7sw9z" (UID: "b5d8eefa-153f-46d6-8848-82778399a098") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:42.636077 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.636032 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78z27\" (UniqueName: \"kubernetes.io/projected/171d0bdf-1d87-4aee-9fad-9c28075596bd-kube-api-access-78z27\") pod \"network-metrics-daemon-hgvbb\" (UID: \"171d0bdf-1d87-4aee-9fad-9c28075596bd\") " pod="openshift-multus/network-metrics-daemon-hgvbb"
Apr 24 22:29:42.637125 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.637104 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl4s7\" (UniqueName: \"kubernetes.io/projected/9de1a8ab-60ad-4326-9262-cd5c6afff9fc-kube-api-access-hl4s7\") pod \"aws-ebs-csi-driver-node-8g4nk\" (UID: \"9de1a8ab-60ad-4326-9262-cd5c6afff9fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk"
Apr 24 22:29:42.695023 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.694988 2582 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:42.722038 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.722007 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qrs6\" (UniqueName: \"kubernetes.io/projected/722d1910-3c2f-4e70-af24-daf9f78fcf06-kube-api-access-7qrs6\") pod \"node-resolver-wtlmx\" (UID: \"722d1910-3c2f-4e70-af24-daf9f78fcf06\") " pod="openshift-dns/node-resolver-wtlmx"
Apr 24
22:29:42.722226 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.722076 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/722d1910-3c2f-4e70-af24-daf9f78fcf06-hosts-file\") pod \"node-resolver-wtlmx\" (UID: \"722d1910-3c2f-4e70-af24-daf9f78fcf06\") " pod="openshift-dns/node-resolver-wtlmx"
Apr 24 22:29:42.722226 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.722160 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/722d1910-3c2f-4e70-af24-daf9f78fcf06-tmp-dir\") pod \"node-resolver-wtlmx\" (UID: \"722d1910-3c2f-4e70-af24-daf9f78fcf06\") " pod="openshift-dns/node-resolver-wtlmx"
Apr 24 22:29:42.722327 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.722234 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/722d1910-3c2f-4e70-af24-daf9f78fcf06-hosts-file\") pod \"node-resolver-wtlmx\" (UID: \"722d1910-3c2f-4e70-af24-daf9f78fcf06\") " pod="openshift-dns/node-resolver-wtlmx"
Apr 24 22:29:42.722503 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.722487 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/722d1910-3c2f-4e70-af24-daf9f78fcf06-tmp-dir\") pod \"node-resolver-wtlmx\" (UID: \"722d1910-3c2f-4e70-af24-daf9f78fcf06\") " pod="openshift-dns/node-resolver-wtlmx"
Apr 24 22:29:42.733096 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.733003 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qrs6\" (UniqueName: \"kubernetes.io/projected/722d1910-3c2f-4e70-af24-daf9f78fcf06-kube-api-access-7qrs6\") pod \"node-resolver-wtlmx\" (UID: \"722d1910-3c2f-4e70-af24-daf9f78fcf06\") " pod="openshift-dns/node-resolver-wtlmx"
Apr 24 22:29:42.808998 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.808947 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j4t6h"
Apr 24 22:29:42.814807 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.814765 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr"
Apr 24 22:29:42.822778 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.822759 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-kg2xf"
Apr 24 22:29:42.829414 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.829390 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9wphl"
Apr 24 22:29:42.837369 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.837349 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-z9q4w"
Apr 24 22:29:42.846939 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.846908 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-pqn8r"
Apr 24 22:29:42.853722 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.853699 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk"
Apr 24 22:29:42.859387 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.859362 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7pchr"
Apr 24 22:29:42.864956 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:42.864935 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wtlmx"
Apr 24 22:29:43.124993 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:43.124959 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/171d0bdf-1d87-4aee-9fad-9c28075596bd-metrics-certs\") pod \"network-metrics-daemon-hgvbb\" (UID: \"171d0bdf-1d87-4aee-9fad-9c28075596bd\") " pod="openshift-multus/network-metrics-daemon-hgvbb"
Apr 24 22:29:43.125190 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:43.125167 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:43.125270 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:43.125254 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/171d0bdf-1d87-4aee-9fad-9c28075596bd-metrics-certs podName:171d0bdf-1d87-4aee-9fad-9c28075596bd nodeName:}" failed. No retries permitted until 2026-04-24 22:29:44.125233168 +0000 UTC m=+4.069024491 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/171d0bdf-1d87-4aee-9fad-9c28075596bd-metrics-certs") pod "network-metrics-daemon-hgvbb" (UID: "171d0bdf-1d87-4aee-9fad-9c28075596bd") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:43.220111 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:43.218364 2582 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:43.225337 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:43.225312 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vznvq\" (UniqueName: \"kubernetes.io/projected/b5d8eefa-153f-46d6-8848-82778399a098-kube-api-access-vznvq\") pod \"network-check-target-7sw9z\" (UID: \"b5d8eefa-153f-46d6-8848-82778399a098\") " pod="openshift-network-diagnostics/network-check-target-7sw9z"
Apr 24 22:29:43.225450 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:43.225435 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:29:43.225499 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:43.225455 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:29:43.225499 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:43.225466 2582 projected.go:194] Error preparing data for projected volume kube-api-access-vznvq for pod openshift-network-diagnostics/network-check-target-7sw9z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:43.225601 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:43.225513 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b5d8eefa-153f-46d6-8848-82778399a098-kube-api-access-vznvq podName:b5d8eefa-153f-46d6-8848-82778399a098 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:44.225499513 +0000 UTC m=+4.169290821 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-vznvq" (UniqueName: "kubernetes.io/projected/b5d8eefa-153f-46d6-8848-82778399a098-kube-api-access-vznvq") pod "network-check-target-7sw9z" (UID: "b5d8eefa-153f-46d6-8848-82778399a098") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:43.247171 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:43.247121 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod239c26d8_bd64_4f99_9455_4fceceb609ee.slice/crio-93cd04661e50bcb040e51d8abb7fe7f94b6518f9aca127b6b3a1aa3b60975450 WatchSource:0}: Error finding container 93cd04661e50bcb040e51d8abb7fe7f94b6518f9aca127b6b3a1aa3b60975450: Status 404 returned error can't find the container with id 93cd04661e50bcb040e51d8abb7fe7f94b6518f9aca127b6b3a1aa3b60975450
Apr 24 22:29:43.248812 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:43.248790 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4851636_e409_4338_9170_49d3547b7af4.slice/crio-679b7ada808f2a4c50faa689ee25fa749cb74fdca208163140255a9b4b2ad149 WatchSource:0}: Error finding container 679b7ada808f2a4c50faa689ee25fa749cb74fdca208163140255a9b4b2ad149: Status 404 returned error can't find the container with id 679b7ada808f2a4c50faa689ee25fa749cb74fdca208163140255a9b4b2ad149
Apr 24 22:29:43.250234 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:43.250165 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod722d1910_3c2f_4e70_af24_daf9f78fcf06.slice/crio-806ef4553b1048c511013e45fcdd64e8d13a441d19b580a938447a4322921360 WatchSource:0}: Error finding container 806ef4553b1048c511013e45fcdd64e8d13a441d19b580a938447a4322921360: Status 404 returned error can't find the container with id 806ef4553b1048c511013e45fcdd64e8d13a441d19b580a938447a4322921360
Apr 24 22:29:43.253632 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:43.253613 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c9ae31a_f5e8_444b_8692_a6e8b24d04ad.slice/crio-9e03bc66862563ddfa725d66e7405de608b874d06f95512a3b6f51277d1d265b WatchSource:0}: Error finding container 9e03bc66862563ddfa725d66e7405de608b874d06f95512a3b6f51277d1d265b: Status 404 returned error can't find the container with id 9e03bc66862563ddfa725d66e7405de608b874d06f95512a3b6f51277d1d265b
Apr 24 22:29:43.254445 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:43.254353 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f665f36_3e6e_4199_bbcd_df474abfeb86.slice/crio-3821deb535f9466b5969a86f0feece7aad1e94545ba1a57480b3655f80814bea WatchSource:0}: Error finding container 3821deb535f9466b5969a86f0feece7aad1e94545ba1a57480b3655f80814bea: Status 404 returned error can't find the container with id 3821deb535f9466b5969a86f0feece7aad1e94545ba1a57480b3655f80814bea
Apr 24 22:29:43.255338 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:43.255278 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cf45f35_3263_4dd4_83bf_caaac71acebd.slice/crio-358785041b91f593e1b0b5bc2110f96a0a20e4a94d848dce7b1027b36e900c6b WatchSource:0}: Error finding container 358785041b91f593e1b0b5bc2110f96a0a20e4a94d848dce7b1027b36e900c6b: Status 404 returned error can't find
the container with id 358785041b91f593e1b0b5bc2110f96a0a20e4a94d848dce7b1027b36e900c6b
Apr 24 22:29:43.258979 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:43.257203 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44759cca_eeb7_4b34_af4d_65cef31d60a1.slice/crio-a88f84f1852f078ea3cbfcd5ea77480e73f986c973097a2a05b033ec84d03957 WatchSource:0}: Error finding container a88f84f1852f078ea3cbfcd5ea77480e73f986c973097a2a05b033ec84d03957: Status 404 returned error can't find the container with id a88f84f1852f078ea3cbfcd5ea77480e73f986c973097a2a05b033ec84d03957
Apr 24 22:29:43.258979 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:43.258308 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4b8fbfd_f18c_4b29_9c01_547311bd0ba6.slice/crio-fa60f8d3be493b50afdccdf692623c0aef206677c45a7b10763112e301a08d08 WatchSource:0}: Error finding container fa60f8d3be493b50afdccdf692623c0aef206677c45a7b10763112e301a08d08: Status 404 returned error can't find the container with id fa60f8d3be493b50afdccdf692623c0aef206677c45a7b10763112e301a08d08
Apr 24 22:29:43.259453 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:29:43.259226 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9de1a8ab_60ad_4326_9262_cd5c6afff9fc.slice/crio-24905a53d7878c13ac0c849c54c23ca0219e4cf9aa5d4b7ece5c008a4b363d9d WatchSource:0}: Error finding container 24905a53d7878c13ac0c849c54c23ca0219e4cf9aa5d4b7ece5c008a4b363d9d: Status 404 returned error can't find the container with id 24905a53d7878c13ac0c849c54c23ca0219e4cf9aa5d4b7ece5c008a4b363d9d
Apr 24 22:29:43.552207 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:43.551963 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 22:24:41 +0000 UTC" deadline="2027-10-08 11:58:02.804219776 +0000 UTC"
Apr 24 22:29:43.552207 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:43.552202 2582 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12757h28m19.2520218s"
Apr 24 22:29:43.632469 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:43.632224 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7sw9z"
Apr 24 22:29:43.632469 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:43.632352 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7sw9z" podUID="b5d8eefa-153f-46d6-8848-82778399a098"
Apr 24 22:29:43.643420 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:43.643347 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9wphl" event={"ID":"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6","Type":"ContainerStarted","Data":"fa60f8d3be493b50afdccdf692623c0aef206677c45a7b10763112e301a08d08"}
Apr 24 22:29:43.652504 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:43.652439 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j4t6h" event={"ID":"4f665f36-3e6e-4199-bbcd-df474abfeb86","Type":"ContainerStarted","Data":"3821deb535f9466b5969a86f0feece7aad1e94545ba1a57480b3655f80814bea"}
Apr 24 22:29:43.658687 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:43.658649 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z9q4w" event={"ID":"44759cca-eeb7-4b34-af4d-65cef31d60a1","Type":"ContainerStarted","Data":"a88f84f1852f078ea3cbfcd5ea77480e73f986c973097a2a05b033ec84d03957"}
Apr 24 22:29:43.660531 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:43.660494 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-kg2xf" event={"ID":"3c9ae31a-f5e8-444b-8692-a6e8b24d04ad","Type":"ContainerStarted","Data":"9e03bc66862563ddfa725d66e7405de608b874d06f95512a3b6f51277d1d265b"}
Apr 24 22:29:43.665648 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:43.665616 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wtlmx" event={"ID":"722d1910-3c2f-4e70-af24-daf9f78fcf06","Type":"ContainerStarted","Data":"806ef4553b1048c511013e45fcdd64e8d13a441d19b580a938447a4322921360"}
Apr 24 22:29:43.668994 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:43.668963 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7pchr" event={"ID":"f4851636-e409-4338-9170-49d3547b7af4","Type":"ContainerStarted","Data":"679b7ada808f2a4c50faa689ee25fa749cb74fdca208163140255a9b4b2ad149"}
Apr 24 22:29:43.674014 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:43.673981 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-222.ec2.internal" event={"ID":"b28c4947c642c3a6ca28e65c847e2583","Type":"ContainerStarted","Data":"4fbf6d0eb1e7c254986000f366f8e55579d98dadb05c47ef1481432e672876e2"}
Apr 24 22:29:43.680934 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:43.680903 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-pqn8r" event={"ID":"7cf45f35-3263-4dd4-83bf-caaac71acebd","Type":"ContainerStarted","Data":"358785041b91f593e1b0b5bc2110f96a0a20e4a94d848dce7b1027b36e900c6b"}
Apr 24 22:29:43.689508 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:43.689450 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-222.ec2.internal" podStartSLOduration=1.689429343 podStartE2EDuration="1.689429343s"
podCreationTimestamp="2026-04-24 22:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:29:43.688385643 +0000 UTC m=+3.632176976" watchObservedRunningTime="2026-04-24 22:29:43.689429343 +0000 UTC m=+3.633220675" Apr 24 22:29:43.692969 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:43.692932 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" event={"ID":"239c26d8-bd64-4f99-9455-4fceceb609ee","Type":"ContainerStarted","Data":"93cd04661e50bcb040e51d8abb7fe7f94b6518f9aca127b6b3a1aa3b60975450"} Apr 24 22:29:43.700670 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:43.700622 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk" event={"ID":"9de1a8ab-60ad-4326-9262-cd5c6afff9fc","Type":"ContainerStarted","Data":"24905a53d7878c13ac0c849c54c23ca0219e4cf9aa5d4b7ece5c008a4b363d9d"} Apr 24 22:29:44.133802 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:44.133764 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/171d0bdf-1d87-4aee-9fad-9c28075596bd-metrics-certs\") pod \"network-metrics-daemon-hgvbb\" (UID: \"171d0bdf-1d87-4aee-9fad-9c28075596bd\") " pod="openshift-multus/network-metrics-daemon-hgvbb" Apr 24 22:29:44.133954 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:44.133918 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:44.134028 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:44.133973 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/171d0bdf-1d87-4aee-9fad-9c28075596bd-metrics-certs podName:171d0bdf-1d87-4aee-9fad-9c28075596bd nodeName:}" failed. 
No retries permitted until 2026-04-24 22:29:46.133959207 +0000 UTC m=+6.077750516 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/171d0bdf-1d87-4aee-9fad-9c28075596bd-metrics-certs") pod "network-metrics-daemon-hgvbb" (UID: "171d0bdf-1d87-4aee-9fad-9c28075596bd") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:44.234679 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:44.234617 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vznvq\" (UniqueName: \"kubernetes.io/projected/b5d8eefa-153f-46d6-8848-82778399a098-kube-api-access-vznvq\") pod \"network-check-target-7sw9z\" (UID: \"b5d8eefa-153f-46d6-8848-82778399a098\") " pod="openshift-network-diagnostics/network-check-target-7sw9z" Apr 24 22:29:44.234837 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:44.234786 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:29:44.234837 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:44.234807 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:29:44.234837 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:44.234819 2582 projected.go:194] Error preparing data for projected volume kube-api-access-vznvq for pod openshift-network-diagnostics/network-check-target-7sw9z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:44.235012 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:44.234872 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b5d8eefa-153f-46d6-8848-82778399a098-kube-api-access-vznvq 
podName:b5d8eefa-153f-46d6-8848-82778399a098 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:46.23485335 +0000 UTC m=+6.178644667 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-vznvq" (UniqueName: "kubernetes.io/projected/b5d8eefa-153f-46d6-8848-82778399a098-kube-api-access-vznvq") pod "network-check-target-7sw9z" (UID: "b5d8eefa-153f-46d6-8848-82778399a098") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:44.635458 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:44.635425 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgvbb" Apr 24 22:29:44.635907 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:44.635562 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgvbb" podUID="171d0bdf-1d87-4aee-9fad-9c28075596bd" Apr 24 22:29:44.712951 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:44.712403 2582 generic.go:358] "Generic (PLEG): container finished" podID="404bdfcf13a36b9e1c667a45ad6916e1" containerID="f4fdca95b1d768c08e5c8ae63b6dd4de1ab7d81a3019c7fcd5299e0d31edf226" exitCode=0 Apr 24 22:29:44.712951 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:44.712909 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-222.ec2.internal" event={"ID":"404bdfcf13a36b9e1c667a45ad6916e1","Type":"ContainerDied","Data":"f4fdca95b1d768c08e5c8ae63b6dd4de1ab7d81a3019c7fcd5299e0d31edf226"} Apr 24 22:29:45.632593 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:45.632556 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7sw9z" Apr 24 22:29:45.632795 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:45.632688 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7sw9z" podUID="b5d8eefa-153f-46d6-8848-82778399a098" Apr 24 22:29:45.728489 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:45.728452 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-222.ec2.internal" event={"ID":"404bdfcf13a36b9e1c667a45ad6916e1","Type":"ContainerStarted","Data":"6a23944631bd7b687789ef3d36941ada01d8d07db83a192514a729eddd23a70d"} Apr 24 22:29:46.150955 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:46.150916 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/171d0bdf-1d87-4aee-9fad-9c28075596bd-metrics-certs\") pod \"network-metrics-daemon-hgvbb\" (UID: \"171d0bdf-1d87-4aee-9fad-9c28075596bd\") " pod="openshift-multus/network-metrics-daemon-hgvbb" Apr 24 22:29:46.151183 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:46.151120 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:46.151251 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:46.151186 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/171d0bdf-1d87-4aee-9fad-9c28075596bd-metrics-certs podName:171d0bdf-1d87-4aee-9fad-9c28075596bd nodeName:}" failed. No retries permitted until 2026-04-24 22:29:50.151167616 +0000 UTC m=+10.094958941 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/171d0bdf-1d87-4aee-9fad-9c28075596bd-metrics-certs") pod "network-metrics-daemon-hgvbb" (UID: "171d0bdf-1d87-4aee-9fad-9c28075596bd") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:46.251888 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:46.251847 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vznvq\" (UniqueName: \"kubernetes.io/projected/b5d8eefa-153f-46d6-8848-82778399a098-kube-api-access-vznvq\") pod \"network-check-target-7sw9z\" (UID: \"b5d8eefa-153f-46d6-8848-82778399a098\") " pod="openshift-network-diagnostics/network-check-target-7sw9z" Apr 24 22:29:46.252091 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:46.252004 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:29:46.252091 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:46.252027 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:29:46.252091 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:46.252039 2582 projected.go:194] Error preparing data for projected volume kube-api-access-vznvq for pod openshift-network-diagnostics/network-check-target-7sw9z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:46.252252 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:46.252123 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b5d8eefa-153f-46d6-8848-82778399a098-kube-api-access-vznvq podName:b5d8eefa-153f-46d6-8848-82778399a098 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:29:50.252102573 +0000 UTC m=+10.195893897 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-vznvq" (UniqueName: "kubernetes.io/projected/b5d8eefa-153f-46d6-8848-82778399a098-kube-api-access-vznvq") pod "network-check-target-7sw9z" (UID: "b5d8eefa-153f-46d6-8848-82778399a098") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:46.634928 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:46.634340 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgvbb" Apr 24 22:29:46.634928 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:46.634478 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgvbb" podUID="171d0bdf-1d87-4aee-9fad-9c28075596bd" Apr 24 22:29:47.632575 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:47.632539 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7sw9z" Apr 24 22:29:47.633107 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:47.632674 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7sw9z" podUID="b5d8eefa-153f-46d6-8848-82778399a098" Apr 24 22:29:48.634829 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:48.634336 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgvbb" Apr 24 22:29:48.634829 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:48.634482 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgvbb" podUID="171d0bdf-1d87-4aee-9fad-9c28075596bd" Apr 24 22:29:49.632588 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:49.632049 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7sw9z" Apr 24 22:29:49.632588 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:49.632194 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7sw9z" podUID="b5d8eefa-153f-46d6-8848-82778399a098" Apr 24 22:29:50.186764 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:50.186141 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/171d0bdf-1d87-4aee-9fad-9c28075596bd-metrics-certs\") pod \"network-metrics-daemon-hgvbb\" (UID: \"171d0bdf-1d87-4aee-9fad-9c28075596bd\") " pod="openshift-multus/network-metrics-daemon-hgvbb" Apr 24 22:29:50.186764 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:50.186299 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:50.186764 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:50.186363 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/171d0bdf-1d87-4aee-9fad-9c28075596bd-metrics-certs podName:171d0bdf-1d87-4aee-9fad-9c28075596bd nodeName:}" failed. No retries permitted until 2026-04-24 22:29:58.186343931 +0000 UTC m=+18.130135246 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/171d0bdf-1d87-4aee-9fad-9c28075596bd-metrics-certs") pod "network-metrics-daemon-hgvbb" (UID: "171d0bdf-1d87-4aee-9fad-9c28075596bd") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:50.286583 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:50.286540 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vznvq\" (UniqueName: \"kubernetes.io/projected/b5d8eefa-153f-46d6-8848-82778399a098-kube-api-access-vznvq\") pod \"network-check-target-7sw9z\" (UID: \"b5d8eefa-153f-46d6-8848-82778399a098\") " pod="openshift-network-diagnostics/network-check-target-7sw9z" Apr 24 22:29:50.286768 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:50.286719 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:29:50.286768 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:50.286734 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:29:50.286768 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:50.286743 2582 projected.go:194] Error preparing data for projected volume kube-api-access-vznvq for pod openshift-network-diagnostics/network-check-target-7sw9z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:50.286930 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:50.286786 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b5d8eefa-153f-46d6-8848-82778399a098-kube-api-access-vznvq podName:b5d8eefa-153f-46d6-8848-82778399a098 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:29:58.286773469 +0000 UTC m=+18.230564782 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-vznvq" (UniqueName: "kubernetes.io/projected/b5d8eefa-153f-46d6-8848-82778399a098-kube-api-access-vznvq") pod "network-check-target-7sw9z" (UID: "b5d8eefa-153f-46d6-8848-82778399a098") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:50.633257 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:50.633219 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgvbb" Apr 24 22:29:50.633435 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:50.633350 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgvbb" podUID="171d0bdf-1d87-4aee-9fad-9c28075596bd" Apr 24 22:29:51.632775 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:51.632725 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7sw9z" Apr 24 22:29:51.633180 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:51.633086 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7sw9z" podUID="b5d8eefa-153f-46d6-8848-82778399a098" Apr 24 22:29:52.632879 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:52.632374 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgvbb" Apr 24 22:29:52.632879 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:52.632517 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgvbb" podUID="171d0bdf-1d87-4aee-9fad-9c28075596bd" Apr 24 22:29:53.632114 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:53.632074 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7sw9z" Apr 24 22:29:53.632313 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:53.632206 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7sw9z" podUID="b5d8eefa-153f-46d6-8848-82778399a098" Apr 24 22:29:53.758138 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:53.758079 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-222.ec2.internal" podStartSLOduration=11.758046052 podStartE2EDuration="11.758046052s" podCreationTimestamp="2026-04-24 22:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:29:45.740997981 +0000 UTC m=+5.684789319" watchObservedRunningTime="2026-04-24 22:29:53.758046052 +0000 UTC m=+13.701837383" Apr 24 22:29:53.758536 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:53.758434 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-zvc7f"] Apr 24 22:29:53.761433 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:53.761413 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zvc7f" Apr 24 22:29:53.761558 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:53.761498 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-zvc7f" podUID="bcb4810e-da56-4f18-8ac9-65765230513d" Apr 24 22:29:53.812184 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:53.812150 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bcb4810e-da56-4f18-8ac9-65765230513d-dbus\") pod \"global-pull-secret-syncer-zvc7f\" (UID: \"bcb4810e-da56-4f18-8ac9-65765230513d\") " pod="kube-system/global-pull-secret-syncer-zvc7f" Apr 24 22:29:53.812357 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:53.812196 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bcb4810e-da56-4f18-8ac9-65765230513d-kubelet-config\") pod \"global-pull-secret-syncer-zvc7f\" (UID: \"bcb4810e-da56-4f18-8ac9-65765230513d\") " pod="kube-system/global-pull-secret-syncer-zvc7f" Apr 24 22:29:53.812357 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:53.812224 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bcb4810e-da56-4f18-8ac9-65765230513d-original-pull-secret\") pod \"global-pull-secret-syncer-zvc7f\" (UID: \"bcb4810e-da56-4f18-8ac9-65765230513d\") " pod="kube-system/global-pull-secret-syncer-zvc7f" Apr 24 22:29:53.913579 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:53.913483 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bcb4810e-da56-4f18-8ac9-65765230513d-dbus\") pod \"global-pull-secret-syncer-zvc7f\" (UID: \"bcb4810e-da56-4f18-8ac9-65765230513d\") " pod="kube-system/global-pull-secret-syncer-zvc7f" Apr 24 22:29:53.913579 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:53.913532 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/bcb4810e-da56-4f18-8ac9-65765230513d-kubelet-config\") pod \"global-pull-secret-syncer-zvc7f\" (UID: \"bcb4810e-da56-4f18-8ac9-65765230513d\") " pod="kube-system/global-pull-secret-syncer-zvc7f" Apr 24 22:29:53.913579 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:53.913560 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bcb4810e-da56-4f18-8ac9-65765230513d-original-pull-secret\") pod \"global-pull-secret-syncer-zvc7f\" (UID: \"bcb4810e-da56-4f18-8ac9-65765230513d\") " pod="kube-system/global-pull-secret-syncer-zvc7f" Apr 24 22:29:53.913836 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:53.913723 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bcb4810e-da56-4f18-8ac9-65765230513d-dbus\") pod \"global-pull-secret-syncer-zvc7f\" (UID: \"bcb4810e-da56-4f18-8ac9-65765230513d\") " pod="kube-system/global-pull-secret-syncer-zvc7f" Apr 24 22:29:53.913836 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:53.913741 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 22:29:53.913836 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:53.913802 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bcb4810e-da56-4f18-8ac9-65765230513d-kubelet-config\") pod \"global-pull-secret-syncer-zvc7f\" (UID: \"bcb4810e-da56-4f18-8ac9-65765230513d\") " pod="kube-system/global-pull-secret-syncer-zvc7f" Apr 24 22:29:53.913947 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:53.913862 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb4810e-da56-4f18-8ac9-65765230513d-original-pull-secret podName:bcb4810e-da56-4f18-8ac9-65765230513d nodeName:}" failed. 
No retries permitted until 2026-04-24 22:29:54.413846427 +0000 UTC m=+14.357637736 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bcb4810e-da56-4f18-8ac9-65765230513d-original-pull-secret") pod "global-pull-secret-syncer-zvc7f" (UID: "bcb4810e-da56-4f18-8ac9-65765230513d") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:54.417077 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:54.417030 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bcb4810e-da56-4f18-8ac9-65765230513d-original-pull-secret\") pod \"global-pull-secret-syncer-zvc7f\" (UID: \"bcb4810e-da56-4f18-8ac9-65765230513d\") " pod="kube-system/global-pull-secret-syncer-zvc7f"
Apr 24 22:29:54.417256 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:54.417186 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:54.417256 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:54.417255 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb4810e-da56-4f18-8ac9-65765230513d-original-pull-secret podName:bcb4810e-da56-4f18-8ac9-65765230513d nodeName:}" failed. No retries permitted until 2026-04-24 22:29:55.417234771 +0000 UTC m=+15.361026080 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bcb4810e-da56-4f18-8ac9-65765230513d-original-pull-secret") pod "global-pull-secret-syncer-zvc7f" (UID: "bcb4810e-da56-4f18-8ac9-65765230513d") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:54.632095 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:54.632037 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgvbb"
Apr 24 22:29:54.632277 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:54.632200 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgvbb" podUID="171d0bdf-1d87-4aee-9fad-9c28075596bd"
Apr 24 22:29:55.425279 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:55.425240 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bcb4810e-da56-4f18-8ac9-65765230513d-original-pull-secret\") pod \"global-pull-secret-syncer-zvc7f\" (UID: \"bcb4810e-da56-4f18-8ac9-65765230513d\") " pod="kube-system/global-pull-secret-syncer-zvc7f"
Apr 24 22:29:55.425742 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:55.425431 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:55.425742 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:55.425517 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb4810e-da56-4f18-8ac9-65765230513d-original-pull-secret podName:bcb4810e-da56-4f18-8ac9-65765230513d nodeName:}" failed. No retries permitted until 2026-04-24 22:29:57.425494711 +0000 UTC m=+17.369286037 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bcb4810e-da56-4f18-8ac9-65765230513d-original-pull-secret") pod "global-pull-secret-syncer-zvc7f" (UID: "bcb4810e-da56-4f18-8ac9-65765230513d") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:55.631892 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:55.631849 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7sw9z"
Apr 24 22:29:55.632084 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:55.631850 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zvc7f"
Apr 24 22:29:55.632084 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:55.631981 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7sw9z" podUID="b5d8eefa-153f-46d6-8848-82778399a098"
Apr 24 22:29:55.632084 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:55.632073 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zvc7f" podUID="bcb4810e-da56-4f18-8ac9-65765230513d"
Apr 24 22:29:56.632239 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:56.632199 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgvbb"
Apr 24 22:29:56.632677 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:56.632346 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgvbb" podUID="171d0bdf-1d87-4aee-9fad-9c28075596bd"
Apr 24 22:29:57.440664 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:57.440620 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bcb4810e-da56-4f18-8ac9-65765230513d-original-pull-secret\") pod \"global-pull-secret-syncer-zvc7f\" (UID: \"bcb4810e-da56-4f18-8ac9-65765230513d\") " pod="kube-system/global-pull-secret-syncer-zvc7f"
Apr 24 22:29:57.441316 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:57.441291 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:57.441440 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:57.441380 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb4810e-da56-4f18-8ac9-65765230513d-original-pull-secret podName:bcb4810e-da56-4f18-8ac9-65765230513d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:01.441358 +0000 UTC m=+21.385149327 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bcb4810e-da56-4f18-8ac9-65765230513d-original-pull-secret") pod "global-pull-secret-syncer-zvc7f" (UID: "bcb4810e-da56-4f18-8ac9-65765230513d") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:57.632522 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:57.632487 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7sw9z"
Apr 24 22:29:57.632955 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:57.632615 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7sw9z" podUID="b5d8eefa-153f-46d6-8848-82778399a098"
Apr 24 22:29:57.632955 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:57.632684 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zvc7f"
Apr 24 22:29:57.632955 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:57.632792 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zvc7f" podUID="bcb4810e-da56-4f18-8ac9-65765230513d"
Apr 24 22:29:58.245241 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:58.245202 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/171d0bdf-1d87-4aee-9fad-9c28075596bd-metrics-certs\") pod \"network-metrics-daemon-hgvbb\" (UID: \"171d0bdf-1d87-4aee-9fad-9c28075596bd\") " pod="openshift-multus/network-metrics-daemon-hgvbb"
Apr 24 22:29:58.245410 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:58.245352 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:58.245452 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:58.245416 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/171d0bdf-1d87-4aee-9fad-9c28075596bd-metrics-certs podName:171d0bdf-1d87-4aee-9fad-9c28075596bd nodeName:}" failed. No retries permitted until 2026-04-24 22:30:14.245398632 +0000 UTC m=+34.189189963 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/171d0bdf-1d87-4aee-9fad-9c28075596bd-metrics-certs") pod "network-metrics-daemon-hgvbb" (UID: "171d0bdf-1d87-4aee-9fad-9c28075596bd") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:58.346301 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:58.346261 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vznvq\" (UniqueName: \"kubernetes.io/projected/b5d8eefa-153f-46d6-8848-82778399a098-kube-api-access-vznvq\") pod \"network-check-target-7sw9z\" (UID: \"b5d8eefa-153f-46d6-8848-82778399a098\") " pod="openshift-network-diagnostics/network-check-target-7sw9z"
Apr 24 22:29:58.346484 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:58.346423 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:29:58.346484 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:58.346445 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:29:58.346484 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:58.346459 2582 projected.go:194] Error preparing data for projected volume kube-api-access-vznvq for pod openshift-network-diagnostics/network-check-target-7sw9z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:58.346600 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:58.346522 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b5d8eefa-153f-46d6-8848-82778399a098-kube-api-access-vznvq podName:b5d8eefa-153f-46d6-8848-82778399a098 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:14.34650328 +0000 UTC m=+34.290294592 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-vznvq" (UniqueName: "kubernetes.io/projected/b5d8eefa-153f-46d6-8848-82778399a098-kube-api-access-vznvq") pod "network-check-target-7sw9z" (UID: "b5d8eefa-153f-46d6-8848-82778399a098") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:58.634899 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:58.634867 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgvbb"
Apr 24 22:29:58.635325 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:58.635002 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgvbb" podUID="171d0bdf-1d87-4aee-9fad-9c28075596bd"
Apr 24 22:29:59.632600 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:59.632564 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7sw9z"
Apr 24 22:29:59.632871 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:29:59.632538 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zvc7f"
Apr 24 22:29:59.632871 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:59.632696 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7sw9z" podUID="b5d8eefa-153f-46d6-8848-82778399a098"
Apr 24 22:29:59.632871 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:29:59.632751 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zvc7f" podUID="bcb4810e-da56-4f18-8ac9-65765230513d"
Apr 24 22:30:00.634140 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:00.634111 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgvbb"
Apr 24 22:30:00.634523 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:00.634246 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgvbb" podUID="171d0bdf-1d87-4aee-9fad-9c28075596bd"
Apr 24 22:30:01.471794 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:01.471424 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bcb4810e-da56-4f18-8ac9-65765230513d-original-pull-secret\") pod \"global-pull-secret-syncer-zvc7f\" (UID: \"bcb4810e-da56-4f18-8ac9-65765230513d\") " pod="kube-system/global-pull-secret-syncer-zvc7f"
Apr 24 22:30:01.471935 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:01.471726 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:01.471935 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:01.471926 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb4810e-da56-4f18-8ac9-65765230513d-original-pull-secret podName:bcb4810e-da56-4f18-8ac9-65765230513d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:09.471908434 +0000 UTC m=+29.415699744 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bcb4810e-da56-4f18-8ac9-65765230513d-original-pull-secret") pod "global-pull-secret-syncer-zvc7f" (UID: "bcb4810e-da56-4f18-8ac9-65765230513d") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:01.632203 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:01.632165 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zvc7f"
Apr 24 22:30:01.632203 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:01.632210 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7sw9z"
Apr 24 22:30:01.632429 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:01.632325 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zvc7f" podUID="bcb4810e-da56-4f18-8ac9-65765230513d"
Apr 24 22:30:01.632468 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:01.632434 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7sw9z" podUID="b5d8eefa-153f-46d6-8848-82778399a098"
Apr 24 22:30:01.760675 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:01.760638 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-kg2xf" event={"ID":"3c9ae31a-f5e8-444b-8692-a6e8b24d04ad","Type":"ContainerStarted","Data":"da405c55a07d2477a4f080f079831cb273983038080a8aa6057d7d7c9181743e"}
Apr 24 22:30:01.762378 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:01.762351 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wtlmx" event={"ID":"722d1910-3c2f-4e70-af24-daf9f78fcf06","Type":"ContainerStarted","Data":"566a006bc0e25a1e5f84c8df65d5838e49ed61cad4914cc418c7719c17a51064"}
Apr 24 22:30:01.763974 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:01.763949 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7pchr" event={"ID":"f4851636-e409-4338-9170-49d3547b7af4","Type":"ContainerStarted","Data":"3bd7d1875c8f0b0685a3693d0d8e42794c0cbe2698f684f2e8b22c290915ea98"}
Apr 24 22:30:01.767213 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:01.767192 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzsgr_239c26d8-bd64-4f99-9455-4fceceb609ee/ovn-acl-logging/0.log"
Apr 24 22:30:01.767575 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:01.767545 2582 generic.go:358] "Generic (PLEG): container finished" podID="239c26d8-bd64-4f99-9455-4fceceb609ee" containerID="d354112b6674885814b11404077f5a82d417a0b3fd1e19eca6dfc79273a5b4b2" exitCode=1
Apr 24 22:30:01.767695 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:01.767615 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" event={"ID":"239c26d8-bd64-4f99-9455-4fceceb609ee","Type":"ContainerStarted","Data":"a4cf0a7c704460d89d1109e1939f762df998c05c736421b43f77e64a34e69b8a"}
Apr 24 22:30:01.767695 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:01.767655 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" event={"ID":"239c26d8-bd64-4f99-9455-4fceceb609ee","Type":"ContainerStarted","Data":"c8f90d11fc4c8e8a4033568b9a31149cb667ae15fcd27873adb68276bff35189"}
Apr 24 22:30:01.767695 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:01.767668 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" event={"ID":"239c26d8-bd64-4f99-9455-4fceceb609ee","Type":"ContainerStarted","Data":"53b658f4e0a3ecc8500f9a2e3dbad21ba6fc6e1f977e9fe293255ac2814ac285"}
Apr 24 22:30:01.767695 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:01.767676 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" event={"ID":"239c26d8-bd64-4f99-9455-4fceceb609ee","Type":"ContainerStarted","Data":"e94bf475f3c0a5691aecfb4f1de4eb7dd56aeffc0fbef1b3b7b25c56d133fef4"}
Apr 24 22:30:01.767695 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:01.767683 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" event={"ID":"239c26d8-bd64-4f99-9455-4fceceb609ee","Type":"ContainerDied","Data":"d354112b6674885814b11404077f5a82d417a0b3fd1e19eca6dfc79273a5b4b2"}
Apr 24 22:30:01.767695 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:01.767692 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" event={"ID":"239c26d8-bd64-4f99-9455-4fceceb609ee","Type":"ContainerStarted","Data":"c386c16c4e43a4f49e8cf8034dcae6a56cee973d40803dc418520fb47a161609"}
Apr 24 22:30:01.769122 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:01.769100 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk" event={"ID":"9de1a8ab-60ad-4326-9262-cd5c6afff9fc","Type":"ContainerStarted","Data":"2c0ced20c5d642b67d0f80eeefa7850cb1efff9264d62cb0c8c2d451a11cfa8a"}
Apr 24 22:30:01.770772 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:01.770739 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9wphl" event={"ID":"d4b8fbfd-f18c-4b29-9c01-547311bd0ba6","Type":"ContainerStarted","Data":"8950c594ff9a5ae95a8a3dd1118ef61d0fa5000e1ea23105fe72e245cd086554"}
Apr 24 22:30:01.772153 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:01.772124 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j4t6h" event={"ID":"4f665f36-3e6e-4199-bbcd-df474abfeb86","Type":"ContainerStarted","Data":"262c5d7ef350e6cc8957cf55fe0bee8a250e5bbac2c9783980334c9c70e23b44"}
Apr 24 22:30:01.773742 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:01.773715 2582 generic.go:358] "Generic (PLEG): container finished" podID="44759cca-eeb7-4b34-af4d-65cef31d60a1" containerID="d4d3b5870aa97778ea30734b0874e08d8c86090c135e528abcf9d930d36fc739" exitCode=0
Apr 24 22:30:01.773828 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:01.773752 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z9q4w" event={"ID":"44759cca-eeb7-4b34-af4d-65cef31d60a1","Type":"ContainerDied","Data":"d4d3b5870aa97778ea30734b0874e08d8c86090c135e528abcf9d930d36fc739"}
Apr 24 22:30:01.778870 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:01.777908 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-kg2xf" podStartSLOduration=4.453088854 podStartE2EDuration="21.777892755s" podCreationTimestamp="2026-04-24 22:29:40 +0000 UTC" firstStartedPulling="2026-04-24 22:29:43.255838441 +0000 UTC m=+3.199629754" lastFinishedPulling="2026-04-24 22:30:00.580642341 +0000 UTC m=+20.524433655" observedRunningTime="2026-04-24 22:30:01.777537661 +0000 UTC m=+21.721328992" watchObservedRunningTime="2026-04-24 22:30:01.777892755 +0000 UTC m=+21.721684090"
Apr 24 22:30:01.795707 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:01.795662 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-j4t6h" podStartSLOduration=12.490909863 podStartE2EDuration="21.795647421s" podCreationTimestamp="2026-04-24 22:29:40 +0000 UTC" firstStartedPulling="2026-04-24 22:29:43.256267443 +0000 UTC m=+3.200058765" lastFinishedPulling="2026-04-24 22:29:52.56100501 +0000 UTC m=+12.504796323" observedRunningTime="2026-04-24 22:30:01.795633739 +0000 UTC m=+21.739425070" watchObservedRunningTime="2026-04-24 22:30:01.795647421 +0000 UTC m=+21.739438752"
Apr 24 22:30:01.821072 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:01.821002 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9wphl" podStartSLOduration=4.458950959 podStartE2EDuration="21.820986292s" podCreationTimestamp="2026-04-24 22:29:40 +0000 UTC" firstStartedPulling="2026-04-24 22:29:43.260005692 +0000 UTC m=+3.203797004" lastFinishedPulling="2026-04-24 22:30:00.622041015 +0000 UTC m=+20.565832337" observedRunningTime="2026-04-24 22:30:01.819899507 +0000 UTC m=+21.763690837" watchObservedRunningTime="2026-04-24 22:30:01.820986292 +0000 UTC m=+21.764777623"
Apr 24 22:30:01.845091 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:01.845023 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-7pchr" podStartSLOduration=4.49125009 podStartE2EDuration="21.845008857s" podCreationTimestamp="2026-04-24 22:29:40 +0000 UTC" firstStartedPulling="2026-04-24 22:29:43.250583918 +0000 UTC m=+3.194375232" lastFinishedPulling="2026-04-24 22:30:00.604342677 +0000 UTC m=+20.548133999" observedRunningTime="2026-04-24 22:30:01.844595342 +0000 UTC m=+21.788386673" watchObservedRunningTime="2026-04-24 22:30:01.845008857 +0000 UTC m=+21.788800187"
Apr 24 22:30:01.948051 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:01.948007 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-wtlmx" podStartSLOduration=4.59844936 podStartE2EDuration="21.947990908s" podCreationTimestamp="2026-04-24 22:29:40 +0000 UTC" firstStartedPulling="2026-04-24 22:29:43.25198357 +0000 UTC m=+3.195774882" lastFinishedPulling="2026-04-24 22:30:00.601525105 +0000 UTC m=+20.545316430" observedRunningTime="2026-04-24 22:30:01.873946184 +0000 UTC m=+21.817737527" watchObservedRunningTime="2026-04-24 22:30:01.947990908 +0000 UTC m=+21.891782273"
Apr 24 22:30:02.272291 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:02.272258 2582 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 22:30:02.579198 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:02.579039 2582 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T22:30:02.272284266Z","UUID":"f7cb69bd-f852-44dc-90f4-cab170b746a6","Handler":null,"Name":"","Endpoint":""}
Apr 24 22:30:02.581294 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:02.581270 2582 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 22:30:02.581294 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:02.581302 2582 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 22:30:02.632695 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:02.632662 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgvbb"
Apr 24 22:30:02.632853 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:02.632811 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgvbb" podUID="171d0bdf-1d87-4aee-9fad-9c28075596bd"
Apr 24 22:30:02.777534 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:02.777495 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-pqn8r" event={"ID":"7cf45f35-3263-4dd4-83bf-caaac71acebd","Type":"ContainerStarted","Data":"3faa678f58501ef3822689ee7d9de55297409bade022f8d14dc2e60a41957751"}
Apr 24 22:30:02.780705 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:02.780674 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk" event={"ID":"9de1a8ab-60ad-4326-9262-cd5c6afff9fc","Type":"ContainerStarted","Data":"92363ef0430dbe1146de33d06a18ae304377ee417da236e9797f93a0eec02d66"}
Apr 24 22:30:02.801180 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:02.801123 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-pqn8r" podStartSLOduration=5.478066569 podStartE2EDuration="22.801104442s" podCreationTimestamp="2026-04-24 22:29:40 +0000 UTC" firstStartedPulling="2026-04-24 22:29:43.257605431 +0000 UTC m=+3.201396739" lastFinishedPulling="2026-04-24 22:30:00.580643289 +0000 UTC m=+20.524434612" observedRunningTime="2026-04-24 22:30:02.800365296 +0000 UTC m=+22.744156627" watchObservedRunningTime="2026-04-24 22:30:02.801104442 +0000 UTC m=+22.744895773"
Apr 24 22:30:03.632321 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:03.632285 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7sw9z"
Apr 24 22:30:03.632461 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:03.632396 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7sw9z" podUID="b5d8eefa-153f-46d6-8848-82778399a098"
Apr 24 22:30:03.632515 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:03.632473 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zvc7f"
Apr 24 22:30:03.632548 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:03.632527 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zvc7f" podUID="bcb4810e-da56-4f18-8ac9-65765230513d"
Apr 24 22:30:03.785312 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:03.784990 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzsgr_239c26d8-bd64-4f99-9455-4fceceb609ee/ovn-acl-logging/0.log"
Apr 24 22:30:03.785726 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:03.785583 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" event={"ID":"239c26d8-bd64-4f99-9455-4fceceb609ee","Type":"ContainerStarted","Data":"9e2fdf8e42cf591ecad2e9fe3686461e9d1172ac31210e7e749516d1361f26f9"}
Apr 24 22:30:03.787560 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:03.787528 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk" event={"ID":"9de1a8ab-60ad-4326-9262-cd5c6afff9fc","Type":"ContainerStarted","Data":"f77d131c544f5639fa952c64cb3bb516755ef3de20475162929ffa407844903a"}
Apr 24 22:30:03.807170 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:03.807106 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4nk" podStartSLOduration=3.610020876 podStartE2EDuration="23.807087038s" podCreationTimestamp="2026-04-24 22:29:40 +0000 UTC" firstStartedPulling="2026-04-24 22:29:43.261842595 +0000 UTC m=+3.205633907" lastFinishedPulling="2026-04-24 22:30:03.45890876 +0000 UTC m=+23.402700069" observedRunningTime="2026-04-24 22:30:03.806799327 +0000 UTC m=+23.750590658" watchObservedRunningTime="2026-04-24 22:30:03.807087038 +0000 UTC m=+23.750878371"
Apr 24 22:30:04.632782 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:04.632747 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgvbb"
Apr 24 22:30:04.632974 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:04.632901 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgvbb" podUID="171d0bdf-1d87-4aee-9fad-9c28075596bd"
Apr 24 22:30:05.632457 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:05.632415 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7sw9z"
Apr 24 22:30:05.632457 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:05.632455 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zvc7f"
Apr 24 22:30:05.633088 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:05.632556 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7sw9z" podUID="b5d8eefa-153f-46d6-8848-82778399a098"
Apr 24 22:30:05.633088 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:05.632690 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zvc7f" podUID="bcb4810e-da56-4f18-8ac9-65765230513d"
Apr 24 22:30:05.846857 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:05.846810 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-kg2xf"
Apr 24 22:30:05.847534 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:05.847509 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-kg2xf"
Apr 24 22:30:06.632506 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:06.632328 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgvbb"
Apr 24 22:30:06.633322 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:06.632599 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgvbb" podUID="171d0bdf-1d87-4aee-9fad-9c28075596bd"
Apr 24 22:30:06.795399 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:06.795373 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzsgr_239c26d8-bd64-4f99-9455-4fceceb609ee/ovn-acl-logging/0.log"
Apr 24 22:30:06.795767 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:06.795741 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" event={"ID":"239c26d8-bd64-4f99-9455-4fceceb609ee","Type":"ContainerStarted","Data":"f62f22f04ce1c59a634bf90b86ea075062ea24495352ff5e0ffd62262f05887c"}
Apr 24 22:30:06.796113 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:06.796094 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr"
Apr 24 22:30:06.796373 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:06.796352 2582 scope.go:117] "RemoveContainer" containerID="d354112b6674885814b11404077f5a82d417a0b3fd1e19eca6dfc79273a5b4b2"
Apr 24 22:30:06.797931 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:06.797908 2582 generic.go:358] "Generic (PLEG): container finished" podID="44759cca-eeb7-4b34-af4d-65cef31d60a1" containerID="ecd351ee5a319bdf95bf682dea188074c8de509ffb3e487952a3cb66e04bdcb0" exitCode=0
Apr 24 22:30:06.798036 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:06.797978 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z9q4w" event={"ID":"44759cca-eeb7-4b34-af4d-65cef31d60a1","Type":"ContainerDied","Data":"ecd351ee5a319bdf95bf682dea188074c8de509ffb3e487952a3cb66e04bdcb0"}
Apr 24 22:30:06.798263 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:06.798240 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-kg2xf"
Apr 24 22:30:06.798843 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:06.798828 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-kg2xf"
Apr 24 22:30:06.812716 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:06.812690 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr"
Apr 24 22:30:07.631824 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:07.631796 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zvc7f"
Apr 24 22:30:07.631824 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:07.631823 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7sw9z"
Apr 24 22:30:07.632019 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:07.631913 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zvc7f" podUID="bcb4810e-da56-4f18-8ac9-65765230513d"
Apr 24 22:30:07.632073 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:07.632013 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7sw9z" podUID="b5d8eefa-153f-46d6-8848-82778399a098"
Apr 24 22:30:07.801623 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:07.801583 2582 generic.go:358] "Generic (PLEG): container finished" podID="44759cca-eeb7-4b34-af4d-65cef31d60a1" containerID="b9cf456d2548650d2309556778555440e79a493f1124f0d6cd44256207234d4c" exitCode=0
Apr 24 22:30:07.802047 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:07.801675 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z9q4w" event={"ID":"44759cca-eeb7-4b34-af4d-65cef31d60a1","Type":"ContainerDied","Data":"b9cf456d2548650d2309556778555440e79a493f1124f0d6cd44256207234d4c"}
Apr 24 22:30:07.805157 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:07.805137 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzsgr_239c26d8-bd64-4f99-9455-4fceceb609ee/ovn-acl-logging/0.log"
Apr 24 22:30:07.805519 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:07.805475 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" event={"ID":"239c26d8-bd64-4f99-9455-4fceceb609ee","Type":"ContainerStarted","Data":"adf804cc4649f65106f99db68551cc58b8307c83dfe0afff0104f083896aee5e"}
Apr 24 22:30:07.805830 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:07.805815 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr"
Apr 24 22:30:07.805909 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:07.805835 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr"
Apr 24 22:30:07.821896 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:07.821870 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr"
Apr 24 22:30:07.879711 ip-10-0-135-222
kubenswrapper[2582]: I0424 22:30:07.879653 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr" podStartSLOduration=10.464830503 podStartE2EDuration="27.879638911s" podCreationTimestamp="2026-04-24 22:29:40 +0000 UTC" firstStartedPulling="2026-04-24 22:29:43.249142226 +0000 UTC m=+3.192933548" lastFinishedPulling="2026-04-24 22:30:00.66395063 +0000 UTC m=+20.607741956" observedRunningTime="2026-04-24 22:30:07.877993624 +0000 UTC m=+27.821784979" watchObservedRunningTime="2026-04-24 22:30:07.879638911 +0000 UTC m=+27.823430242" Apr 24 22:30:07.939134 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:07.939099 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-zvc7f"] Apr 24 22:30:07.939303 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:07.939219 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zvc7f" Apr 24 22:30:07.939363 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:07.939335 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zvc7f" podUID="bcb4810e-da56-4f18-8ac9-65765230513d" Apr 24 22:30:07.944248 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:07.944196 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7sw9z"] Apr 24 22:30:07.944385 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:07.944334 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7sw9z" Apr 24 22:30:07.944499 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:07.944472 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7sw9z" podUID="b5d8eefa-153f-46d6-8848-82778399a098" Apr 24 22:30:07.944796 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:07.944762 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hgvbb"] Apr 24 22:30:07.944919 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:07.944895 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgvbb" Apr 24 22:30:07.945039 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:07.945016 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgvbb" podUID="171d0bdf-1d87-4aee-9fad-9c28075596bd" Apr 24 22:30:08.809318 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:08.809143 2582 generic.go:358] "Generic (PLEG): container finished" podID="44759cca-eeb7-4b34-af4d-65cef31d60a1" containerID="b2a542bd42315a5151dd45764321593a37048847151ad4dacf0a413f63497b13" exitCode=0 Apr 24 22:30:08.809725 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:08.809235 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z9q4w" event={"ID":"44759cca-eeb7-4b34-af4d-65cef31d60a1","Type":"ContainerDied","Data":"b2a542bd42315a5151dd45764321593a37048847151ad4dacf0a413f63497b13"} Apr 24 22:30:09.533037 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:09.532998 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bcb4810e-da56-4f18-8ac9-65765230513d-original-pull-secret\") pod \"global-pull-secret-syncer-zvc7f\" (UID: \"bcb4810e-da56-4f18-8ac9-65765230513d\") " pod="kube-system/global-pull-secret-syncer-zvc7f" Apr 24 22:30:09.533263 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:09.533184 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 22:30:09.533336 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:09.533265 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb4810e-da56-4f18-8ac9-65765230513d-original-pull-secret podName:bcb4810e-da56-4f18-8ac9-65765230513d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:25.533244837 +0000 UTC m=+45.477036154 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bcb4810e-da56-4f18-8ac9-65765230513d-original-pull-secret") pod "global-pull-secret-syncer-zvc7f" (UID: "bcb4810e-da56-4f18-8ac9-65765230513d") : object "kube-system"/"original-pull-secret" not registered Apr 24 22:30:09.632370 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:09.632338 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zvc7f" Apr 24 22:30:09.632570 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:09.632534 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zvc7f" podUID="bcb4810e-da56-4f18-8ac9-65765230513d" Apr 24 22:30:09.632702 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:09.632584 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgvbb" Apr 24 22:30:09.632702 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:09.632624 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7sw9z" Apr 24 22:30:09.632807 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:09.632722 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7sw9z" podUID="b5d8eefa-153f-46d6-8848-82778399a098" Apr 24 22:30:09.632867 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:09.632802 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgvbb" podUID="171d0bdf-1d87-4aee-9fad-9c28075596bd" Apr 24 22:30:11.632824 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:11.632747 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7sw9z" Apr 24 22:30:11.633458 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:11.632880 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7sw9z" podUID="b5d8eefa-153f-46d6-8848-82778399a098" Apr 24 22:30:11.633458 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:11.632758 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgvbb" Apr 24 22:30:11.633458 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:11.632988 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgvbb" podUID="171d0bdf-1d87-4aee-9fad-9c28075596bd" Apr 24 22:30:11.633458 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:11.632747 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zvc7f" Apr 24 22:30:11.633458 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:11.633083 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zvc7f" podUID="bcb4810e-da56-4f18-8ac9-65765230513d" Apr 24 22:30:13.365345 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.365313 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-222.ec2.internal" event="NodeReady" Apr 24 22:30:13.365783 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.365470 2582 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 22:30:13.417559 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.417519 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5568c976c5-5n4wk"] Apr 24 22:30:13.424715 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.424685 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" Apr 24 22:30:13.429034 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.428652 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 22:30:13.429034 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.428714 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 22:30:13.429034 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.428846 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 22:30:13.429361 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.429322 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-lr488\"" Apr 24 22:30:13.437402 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.437371 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vp89s"] Apr 24 22:30:13.440464 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.440440 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qchcv"] Apr 24 22:30:13.440646 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.440619 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vp89s" Apr 24 22:30:13.443231 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.443204 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 22:30:13.443371 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.443261 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qchcv" Apr 24 22:30:13.444353 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.444330 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qwslk\"" Apr 24 22:30:13.444486 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.444336 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 22:30:13.446236 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.446214 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 22:30:13.446236 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.446221 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 22:30:13.446364 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.446293 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-x6wvt\"" Apr 24 22:30:13.446364 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.446323 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 22:30:13.446902 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.446882 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 22:30:13.467097 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.467047 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5568c976c5-5n4wk"] Apr 24 22:30:13.468609 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.468580 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vp89s"] Apr 24 22:30:13.468748 ip-10-0-135-222 kubenswrapper[2582]: I0424 
22:30:13.468623 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qchcv"] Apr 24 22:30:13.562601 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.562559 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-trusted-ca\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" Apr 24 22:30:13.562803 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.562625 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-certificates\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" Apr 24 22:30:13.562803 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.562671 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-installation-pull-secrets\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" Apr 24 22:30:13.562803 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.562702 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4fd3bad0-b406-44c2-b540-cbbaf1436e6d-cert\") pod \"ingress-canary-qchcv\" (UID: \"4fd3bad0-b406-44c2-b540-cbbaf1436e6d\") " pod="openshift-ingress-canary/ingress-canary-qchcv" Apr 24 22:30:13.562803 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.562729 2582 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-bound-sa-token\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" Apr 24 22:30:13.562803 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.562780 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d25320e2-53da-44c0-bfb0-8ed3f795faf6-metrics-tls\") pod \"dns-default-vp89s\" (UID: \"d25320e2-53da-44c0-bfb0-8ed3f795faf6\") " pod="openshift-dns/dns-default-vp89s" Apr 24 22:30:13.562803 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.562807 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d25320e2-53da-44c0-bfb0-8ed3f795faf6-tmp-dir\") pod \"dns-default-vp89s\" (UID: \"d25320e2-53da-44c0-bfb0-8ed3f795faf6\") " pod="openshift-dns/dns-default-vp89s" Apr 24 22:30:13.563128 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.562837 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvkqs\" (UniqueName: \"kubernetes.io/projected/4fd3bad0-b406-44c2-b540-cbbaf1436e6d-kube-api-access-qvkqs\") pod \"ingress-canary-qchcv\" (UID: \"4fd3bad0-b406-44c2-b540-cbbaf1436e6d\") " pod="openshift-ingress-canary/ingress-canary-qchcv" Apr 24 22:30:13.563128 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.562861 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-image-registry-private-configuration\") pod \"image-registry-5568c976c5-5n4wk\" (UID: 
\"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" Apr 24 22:30:13.563128 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.562884 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxcl6\" (UniqueName: \"kubernetes.io/projected/d25320e2-53da-44c0-bfb0-8ed3f795faf6-kube-api-access-jxcl6\") pod \"dns-default-vp89s\" (UID: \"d25320e2-53da-44c0-bfb0-8ed3f795faf6\") " pod="openshift-dns/dns-default-vp89s" Apr 24 22:30:13.563128 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.562911 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d25320e2-53da-44c0-bfb0-8ed3f795faf6-config-volume\") pod \"dns-default-vp89s\" (UID: \"d25320e2-53da-44c0-bfb0-8ed3f795faf6\") " pod="openshift-dns/dns-default-vp89s" Apr 24 22:30:13.563128 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.562934 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-ca-trust-extracted\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" Apr 24 22:30:13.563128 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.562951 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5q75\" (UniqueName: \"kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-kube-api-access-x5q75\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" Apr 24 22:30:13.563128 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.563088 2582 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-tls\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" Apr 24 22:30:13.632023 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.631991 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgvbb" Apr 24 22:30:13.632230 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.632027 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zvc7f" Apr 24 22:30:13.632230 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.632106 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7sw9z" Apr 24 22:30:13.643280 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.643253 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 22:30:13.643728 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.643706 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-74npb\"" Apr 24 22:30:13.644052 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.643982 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 22:30:13.644052 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.644044 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 22:30:13.644385 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.644370 2582 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 22:30:13.644385 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.644376 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-499pd\"" Apr 24 22:30:13.663623 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.663590 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-tls\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" Apr 24 22:30:13.663623 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.663631 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-trusted-ca\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" Apr 24 22:30:13.663852 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:13.663765 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:13.663852 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:13.663785 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5568c976c5-5n4wk: secret "image-registry-tls" not found Apr 24 22:30:13.663944 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:13.663857 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-tls podName:38193c2c-a2e9-44ba-9dd4-76687aaabfb3 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:14.163835545 +0000 UTC m=+34.107626866 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-tls") pod "image-registry-5568c976c5-5n4wk" (UID: "38193c2c-a2e9-44ba-9dd4-76687aaabfb3") : secret "image-registry-tls" not found Apr 24 22:30:13.663944 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.663771 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-certificates\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" Apr 24 22:30:13.664025 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.663942 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-installation-pull-secrets\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" Apr 24 22:30:13.664025 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.663974 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4fd3bad0-b406-44c2-b540-cbbaf1436e6d-cert\") pod \"ingress-canary-qchcv\" (UID: \"4fd3bad0-b406-44c2-b540-cbbaf1436e6d\") " pod="openshift-ingress-canary/ingress-canary-qchcv" Apr 24 22:30:13.664025 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.664000 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-bound-sa-token\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" Apr 24 22:30:13.664211 
ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:13.664138 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:30:13.664211 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:13.664193 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fd3bad0-b406-44c2-b540-cbbaf1436e6d-cert podName:4fd3bad0-b406-44c2-b540-cbbaf1436e6d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:14.164175775 +0000 UTC m=+34.107967084 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4fd3bad0-b406-44c2-b540-cbbaf1436e6d-cert") pod "ingress-canary-qchcv" (UID: "4fd3bad0-b406-44c2-b540-cbbaf1436e6d") : secret "canary-serving-cert" not found
Apr 24 22:30:13.664316 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.664235 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d25320e2-53da-44c0-bfb0-8ed3f795faf6-metrics-tls\") pod \"dns-default-vp89s\" (UID: \"d25320e2-53da-44c0-bfb0-8ed3f795faf6\") " pod="openshift-dns/dns-default-vp89s"
Apr 24 22:30:13.664316 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.664272 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d25320e2-53da-44c0-bfb0-8ed3f795faf6-tmp-dir\") pod \"dns-default-vp89s\" (UID: \"d25320e2-53da-44c0-bfb0-8ed3f795faf6\") " pod="openshift-dns/dns-default-vp89s"
Apr 24 22:30:13.664316 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.664305 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvkqs\" (UniqueName: \"kubernetes.io/projected/4fd3bad0-b406-44c2-b540-cbbaf1436e6d-kube-api-access-qvkqs\") pod \"ingress-canary-qchcv\" (UID: \"4fd3bad0-b406-44c2-b540-cbbaf1436e6d\") " pod="openshift-ingress-canary/ingress-canary-qchcv"
Apr 24 22:30:13.664462 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:13.664320 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:30:13.664462 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.664334 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-image-registry-private-configuration\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk"
Apr 24 22:30:13.664462 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:13.664369 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d25320e2-53da-44c0-bfb0-8ed3f795faf6-metrics-tls podName:d25320e2-53da-44c0-bfb0-8ed3f795faf6 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:14.164354656 +0000 UTC m=+34.108145966 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d25320e2-53da-44c0-bfb0-8ed3f795faf6-metrics-tls") pod "dns-default-vp89s" (UID: "d25320e2-53da-44c0-bfb0-8ed3f795faf6") : secret "dns-default-metrics-tls" not found
Apr 24 22:30:13.664462 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.664393 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxcl6\" (UniqueName: \"kubernetes.io/projected/d25320e2-53da-44c0-bfb0-8ed3f795faf6-kube-api-access-jxcl6\") pod \"dns-default-vp89s\" (UID: \"d25320e2-53da-44c0-bfb0-8ed3f795faf6\") " pod="openshift-dns/dns-default-vp89s"
Apr 24 22:30:13.664462 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.664432 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d25320e2-53da-44c0-bfb0-8ed3f795faf6-config-volume\") pod \"dns-default-vp89s\" (UID: \"d25320e2-53da-44c0-bfb0-8ed3f795faf6\") " pod="openshift-dns/dns-default-vp89s"
Apr 24 22:30:13.664772 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.664468 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-ca-trust-extracted\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk"
Apr 24 22:30:13.664772 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.664493 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x5q75\" (UniqueName: \"kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-kube-api-access-x5q75\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk"
Apr 24 22:30:13.664772 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.664562 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-certificates\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk"
Apr 24 22:30:13.664772 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.664611 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d25320e2-53da-44c0-bfb0-8ed3f795faf6-tmp-dir\") pod \"dns-default-vp89s\" (UID: \"d25320e2-53da-44c0-bfb0-8ed3f795faf6\") " pod="openshift-dns/dns-default-vp89s"
Apr 24 22:30:13.664772 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.664723 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-trusted-ca\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk"
Apr 24 22:30:13.664997 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.664881 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-ca-trust-extracted\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk"
Apr 24 22:30:13.665131 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.665109 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d25320e2-53da-44c0-bfb0-8ed3f795faf6-config-volume\") pod \"dns-default-vp89s\" (UID: \"d25320e2-53da-44c0-bfb0-8ed3f795faf6\") " pod="openshift-dns/dns-default-vp89s"
Apr 24 22:30:13.668943 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.668918 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-installation-pull-secrets\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk"
Apr 24 22:30:13.669353 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.669327 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-image-registry-private-configuration\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk"
Apr 24 22:30:13.704907 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.704873 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5q75\" (UniqueName: \"kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-kube-api-access-x5q75\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk"
Apr 24 22:30:13.711289 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.711216 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvkqs\" (UniqueName: \"kubernetes.io/projected/4fd3bad0-b406-44c2-b540-cbbaf1436e6d-kube-api-access-qvkqs\") pod \"ingress-canary-qchcv\" (UID: \"4fd3bad0-b406-44c2-b540-cbbaf1436e6d\") " pod="openshift-ingress-canary/ingress-canary-qchcv"
Apr 24 22:30:13.712993 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.712961 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-bound-sa-token\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk"
Apr 24 22:30:13.721765 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:13.721732 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxcl6\" (UniqueName: \"kubernetes.io/projected/d25320e2-53da-44c0-bfb0-8ed3f795faf6-kube-api-access-jxcl6\") pod \"dns-default-vp89s\" (UID: \"d25320e2-53da-44c0-bfb0-8ed3f795faf6\") " pod="openshift-dns/dns-default-vp89s"
Apr 24 22:30:14.168494 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:14.168450 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-tls\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk"
Apr 24 22:30:14.168692 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:14.168520 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4fd3bad0-b406-44c2-b540-cbbaf1436e6d-cert\") pod \"ingress-canary-qchcv\" (UID: \"4fd3bad0-b406-44c2-b540-cbbaf1436e6d\") " pod="openshift-ingress-canary/ingress-canary-qchcv"
Apr 24 22:30:14.168692 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:14.168567 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d25320e2-53da-44c0-bfb0-8ed3f795faf6-metrics-tls\") pod \"dns-default-vp89s\" (UID: \"d25320e2-53da-44c0-bfb0-8ed3f795faf6\") " pod="openshift-dns/dns-default-vp89s"
Apr 24 22:30:14.168692 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:14.168627 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:30:14.168692 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:14.168634 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 22:30:14.168692 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:14.168658 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5568c976c5-5n4wk: secret "image-registry-tls" not found
Apr 24 22:30:14.168692 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:14.168664 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:30:14.168692 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:14.168694 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fd3bad0-b406-44c2-b540-cbbaf1436e6d-cert podName:4fd3bad0-b406-44c2-b540-cbbaf1436e6d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:15.168674336 +0000 UTC m=+35.112465662 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4fd3bad0-b406-44c2-b540-cbbaf1436e6d-cert") pod "ingress-canary-qchcv" (UID: "4fd3bad0-b406-44c2-b540-cbbaf1436e6d") : secret "canary-serving-cert" not found
Apr 24 22:30:14.168979 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:14.168713 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d25320e2-53da-44c0-bfb0-8ed3f795faf6-metrics-tls podName:d25320e2-53da-44c0-bfb0-8ed3f795faf6 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:15.168705156 +0000 UTC m=+35.112496465 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d25320e2-53da-44c0-bfb0-8ed3f795faf6-metrics-tls") pod "dns-default-vp89s" (UID: "d25320e2-53da-44c0-bfb0-8ed3f795faf6") : secret "dns-default-metrics-tls" not found
Apr 24 22:30:14.168979 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:14.168728 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-tls podName:38193c2c-a2e9-44ba-9dd4-76687aaabfb3 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:15.1687203 +0000 UTC m=+35.112511609 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-tls") pod "image-registry-5568c976c5-5n4wk" (UID: "38193c2c-a2e9-44ba-9dd4-76687aaabfb3") : secret "image-registry-tls" not found
Apr 24 22:30:14.269004 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:14.268959 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/171d0bdf-1d87-4aee-9fad-9c28075596bd-metrics-certs\") pod \"network-metrics-daemon-hgvbb\" (UID: \"171d0bdf-1d87-4aee-9fad-9c28075596bd\") " pod="openshift-multus/network-metrics-daemon-hgvbb"
Apr 24 22:30:14.269210 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:14.269117 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 22:30:14.269210 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:14.269189 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/171d0bdf-1d87-4aee-9fad-9c28075596bd-metrics-certs podName:171d0bdf-1d87-4aee-9fad-9c28075596bd nodeName:}" failed. No retries permitted until 2026-04-24 22:30:46.269169476 +0000 UTC m=+66.212960785 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/171d0bdf-1d87-4aee-9fad-9c28075596bd-metrics-certs") pod "network-metrics-daemon-hgvbb" (UID: "171d0bdf-1d87-4aee-9fad-9c28075596bd") : secret "metrics-daemon-secret" not found
Apr 24 22:30:14.369541 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:14.369502 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vznvq\" (UniqueName: \"kubernetes.io/projected/b5d8eefa-153f-46d6-8848-82778399a098-kube-api-access-vznvq\") pod \"network-check-target-7sw9z\" (UID: \"b5d8eefa-153f-46d6-8848-82778399a098\") " pod="openshift-network-diagnostics/network-check-target-7sw9z"
Apr 24 22:30:14.372714 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:14.372682 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vznvq\" (UniqueName: \"kubernetes.io/projected/b5d8eefa-153f-46d6-8848-82778399a098-kube-api-access-vznvq\") pod \"network-check-target-7sw9z\" (UID: \"b5d8eefa-153f-46d6-8848-82778399a098\") " pod="openshift-network-diagnostics/network-check-target-7sw9z"
Apr 24 22:30:14.551646 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:14.551612 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7sw9z"
Apr 24 22:30:14.749448 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:14.749413 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7sw9z"]
Apr 24 22:30:14.781341 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:30:14.781310 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5d8eefa_153f_46d6_8848_82778399a098.slice/crio-9925896bb1adceb880f277a34505a9161a905bd7153c72e2c36371a6f2c21928 WatchSource:0}: Error finding container 9925896bb1adceb880f277a34505a9161a905bd7153c72e2c36371a6f2c21928: Status 404 returned error can't find the container with id 9925896bb1adceb880f277a34505a9161a905bd7153c72e2c36371a6f2c21928
Apr 24 22:30:14.823494 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:14.823465 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7sw9z" event={"ID":"b5d8eefa-153f-46d6-8848-82778399a098","Type":"ContainerStarted","Data":"9925896bb1adceb880f277a34505a9161a905bd7153c72e2c36371a6f2c21928"}
Apr 24 22:30:15.100674 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.100645 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-ff4l2"]
Apr 24 22:30:15.117243 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.117216 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ff4l2"
Apr 24 22:30:15.122326 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.122300 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-b8trd\""
Apr 24 22:30:15.122467 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.122300 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 24 22:30:15.122467 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.122300 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 24 22:30:15.131922 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.131901 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-ff4l2"]
Apr 24 22:30:15.177164 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.177126 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4fd3bad0-b406-44c2-b540-cbbaf1436e6d-cert\") pod \"ingress-canary-qchcv\" (UID: \"4fd3bad0-b406-44c2-b540-cbbaf1436e6d\") " pod="openshift-ingress-canary/ingress-canary-qchcv"
Apr 24 22:30:15.177325 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.177186 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d25320e2-53da-44c0-bfb0-8ed3f795faf6-metrics-tls\") pod \"dns-default-vp89s\" (UID: \"d25320e2-53da-44c0-bfb0-8ed3f795faf6\") " pod="openshift-dns/dns-default-vp89s"
Apr 24 22:30:15.177325 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.177253 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-tls\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk"
Apr 24 22:30:15.177325 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:15.177293 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:30:15.177462 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:15.177347 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:30:15.177462 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:15.177355 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 22:30:15.177462 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:15.177374 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5568c976c5-5n4wk: secret "image-registry-tls" not found
Apr 24 22:30:15.177462 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:15.177356 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fd3bad0-b406-44c2-b540-cbbaf1436e6d-cert podName:4fd3bad0-b406-44c2-b540-cbbaf1436e6d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:17.177336211 +0000 UTC m=+37.121127539 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4fd3bad0-b406-44c2-b540-cbbaf1436e6d-cert") pod "ingress-canary-qchcv" (UID: "4fd3bad0-b406-44c2-b540-cbbaf1436e6d") : secret "canary-serving-cert" not found
Apr 24 22:30:15.177462 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:15.177447 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d25320e2-53da-44c0-bfb0-8ed3f795faf6-metrics-tls podName:d25320e2-53da-44c0-bfb0-8ed3f795faf6 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:17.177419601 +0000 UTC m=+37.121210922 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d25320e2-53da-44c0-bfb0-8ed3f795faf6-metrics-tls") pod "dns-default-vp89s" (UID: "d25320e2-53da-44c0-bfb0-8ed3f795faf6") : secret "dns-default-metrics-tls" not found
Apr 24 22:30:15.177462 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:15.177465 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-tls podName:38193c2c-a2e9-44ba-9dd4-76687aaabfb3 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:17.177456287 +0000 UTC m=+37.121247612 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-tls") pod "image-registry-5568c976c5-5n4wk" (UID: "38193c2c-a2e9-44ba-9dd4-76687aaabfb3") : secret "image-registry-tls" not found
Apr 24 22:30:15.244142 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.244109 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-s5brh"]
Apr 24 22:30:15.263712 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.263680 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-s5brh"
Apr 24 22:30:15.272206 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.272166 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 22:30:15.272335 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.272309 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 22:30:15.272335 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.272166 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 22:30:15.272447 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.272308 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 22:30:15.277297 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.277274 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qbjm6\""
Apr 24 22:30:15.277784 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.277722 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qvv4\" (UniqueName: \"kubernetes.io/projected/ce0f15c5-e087-48f7-8284-7dcf34afed79-kube-api-access-9qvv4\") pod \"migrator-74bb7799d9-ff4l2\" (UID: \"ce0f15c5-e087-48f7-8284-7dcf34afed79\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ff4l2"
Apr 24 22:30:15.278711 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.278690 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-s5brh"]
Apr 24 22:30:15.378879 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.378786 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-crio-socket\") pod \"insights-runtime-extractor-s5brh\" (UID: \"4ffa9d0c-ccdc-4b8c-b83f-12076db312b8\") " pod="openshift-insights/insights-runtime-extractor-s5brh"
Apr 24 22:30:15.378879 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.378839 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s5brh\" (UID: \"4ffa9d0c-ccdc-4b8c-b83f-12076db312b8\") " pod="openshift-insights/insights-runtime-extractor-s5brh"
Apr 24 22:30:15.379481 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.378920 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-data-volume\") pod \"insights-runtime-extractor-s5brh\" (UID: \"4ffa9d0c-ccdc-4b8c-b83f-12076db312b8\") " pod="openshift-insights/insights-runtime-extractor-s5brh"
Apr 24 22:30:15.379481 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.378951 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-s5brh\" (UID: \"4ffa9d0c-ccdc-4b8c-b83f-12076db312b8\") " pod="openshift-insights/insights-runtime-extractor-s5brh"
Apr 24 22:30:15.379481 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.379085 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qvv4\" (UniqueName: \"kubernetes.io/projected/ce0f15c5-e087-48f7-8284-7dcf34afed79-kube-api-access-9qvv4\") pod \"migrator-74bb7799d9-ff4l2\" (UID: \"ce0f15c5-e087-48f7-8284-7dcf34afed79\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ff4l2"
Apr 24 22:30:15.379481 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.379115 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hb8x\" (UniqueName: \"kubernetes.io/projected/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-kube-api-access-2hb8x\") pod \"insights-runtime-extractor-s5brh\" (UID: \"4ffa9d0c-ccdc-4b8c-b83f-12076db312b8\") " pod="openshift-insights/insights-runtime-extractor-s5brh"
Apr 24 22:30:15.390005 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.389975 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qvv4\" (UniqueName: \"kubernetes.io/projected/ce0f15c5-e087-48f7-8284-7dcf34afed79-kube-api-access-9qvv4\") pod \"migrator-74bb7799d9-ff4l2\" (UID: \"ce0f15c5-e087-48f7-8284-7dcf34afed79\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ff4l2"
Apr 24 22:30:15.434094 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.434042 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ff4l2"
Apr 24 22:30:15.479934 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.479899 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-crio-socket\") pod \"insights-runtime-extractor-s5brh\" (UID: \"4ffa9d0c-ccdc-4b8c-b83f-12076db312b8\") " pod="openshift-insights/insights-runtime-extractor-s5brh"
Apr 24 22:30:15.480109 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.479944 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s5brh\" (UID: \"4ffa9d0c-ccdc-4b8c-b83f-12076db312b8\") " pod="openshift-insights/insights-runtime-extractor-s5brh"
Apr 24 22:30:15.480109 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.479992 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-data-volume\") pod \"insights-runtime-extractor-s5brh\" (UID: \"4ffa9d0c-ccdc-4b8c-b83f-12076db312b8\") " pod="openshift-insights/insights-runtime-extractor-s5brh"
Apr 24 22:30:15.480109 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.480015 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-s5brh\" (UID: \"4ffa9d0c-ccdc-4b8c-b83f-12076db312b8\") " pod="openshift-insights/insights-runtime-extractor-s5brh"
Apr 24 22:30:15.480109 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.480083 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hb8x\" (UniqueName: \"kubernetes.io/projected/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-kube-api-access-2hb8x\") pod \"insights-runtime-extractor-s5brh\" (UID: \"4ffa9d0c-ccdc-4b8c-b83f-12076db312b8\") " pod="openshift-insights/insights-runtime-extractor-s5brh"
Apr 24 22:30:15.480311 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:15.480251 2582 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 24 22:30:15.480380 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:15.480337 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-insights-runtime-extractor-tls podName:4ffa9d0c-ccdc-4b8c-b83f-12076db312b8 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:15.980314484 +0000 UTC m=+35.924105816 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-insights-runtime-extractor-tls") pod "insights-runtime-extractor-s5brh" (UID: "4ffa9d0c-ccdc-4b8c-b83f-12076db312b8") : secret "insights-runtime-extractor-tls" not found
Apr 24 22:30:15.480518 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.480496 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-data-volume\") pod \"insights-runtime-extractor-s5brh\" (UID: \"4ffa9d0c-ccdc-4b8c-b83f-12076db312b8\") " pod="openshift-insights/insights-runtime-extractor-s5brh"
Apr 24 22:30:15.480681 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.480666 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-crio-socket\") pod \"insights-runtime-extractor-s5brh\" (UID: \"4ffa9d0c-ccdc-4b8c-b83f-12076db312b8\") " pod="openshift-insights/insights-runtime-extractor-s5brh"
Apr 24 22:30:15.481092 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.481049 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-s5brh\" (UID: \"4ffa9d0c-ccdc-4b8c-b83f-12076db312b8\") " pod="openshift-insights/insights-runtime-extractor-s5brh"
Apr 24 22:30:15.516252 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.516184 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hb8x\" (UniqueName: \"kubernetes.io/projected/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-kube-api-access-2hb8x\") pod \"insights-runtime-extractor-s5brh\" (UID: \"4ffa9d0c-ccdc-4b8c-b83f-12076db312b8\") " pod="openshift-insights/insights-runtime-extractor-s5brh"
Apr 24 22:30:15.589873 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.589817 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-ff4l2"]
Apr 24 22:30:15.594115 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:30:15.594046 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce0f15c5_e087_48f7_8284_7dcf34afed79.slice/crio-6e7cff2f5ad8b2b590f4d532805048ee9e1f728db26dd691890c8e8803179a2a WatchSource:0}: Error finding container 6e7cff2f5ad8b2b590f4d532805048ee9e1f728db26dd691890c8e8803179a2a: Status 404 returned error can't find the container with id 6e7cff2f5ad8b2b590f4d532805048ee9e1f728db26dd691890c8e8803179a2a
Apr 24 22:30:15.828670 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.828637 2582 generic.go:358] "Generic (PLEG): container finished" podID="44759cca-eeb7-4b34-af4d-65cef31d60a1" containerID="951ea83a6931bd462fc4a1a0b2174d8ffedae1af269f340c959cb7c22e307b97" exitCode=0
Apr 24 22:30:15.828838 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.828736 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z9q4w" event={"ID":"44759cca-eeb7-4b34-af4d-65cef31d60a1","Type":"ContainerDied","Data":"951ea83a6931bd462fc4a1a0b2174d8ffedae1af269f340c959cb7c22e307b97"}
Apr 24 22:30:15.830121 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.830090 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ff4l2" event={"ID":"ce0f15c5-e087-48f7-8284-7dcf34afed79","Type":"ContainerStarted","Data":"6e7cff2f5ad8b2b590f4d532805048ee9e1f728db26dd691890c8e8803179a2a"}
Apr 24 22:30:15.985477 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:15.985447 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s5brh\" (UID: \"4ffa9d0c-ccdc-4b8c-b83f-12076db312b8\") " pod="openshift-insights/insights-runtime-extractor-s5brh"
Apr 24 22:30:15.985611 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:15.985560 2582 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 24 22:30:15.985666 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:15.985634 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-insights-runtime-extractor-tls podName:4ffa9d0c-ccdc-4b8c-b83f-12076db312b8 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:16.985616921 +0000 UTC m=+36.929408243 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-insights-runtime-extractor-tls") pod "insights-runtime-extractor-s5brh" (UID: "4ffa9d0c-ccdc-4b8c-b83f-12076db312b8") : secret "insights-runtime-extractor-tls" not found
Apr 24 22:30:16.222581 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:16.222369 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wtlmx_722d1910-3c2f-4e70-af24-daf9f78fcf06/dns-node-resolver/0.log"
Apr 24 22:30:16.540303 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:16.539255 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-fmj5c"]
Apr 24 22:30:16.548310 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:16.548286 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-fmj5c"
Apr 24 22:30:16.552586 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:16.552562 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 24 22:30:16.552870 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:16.552852 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 24 22:30:16.553151 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:16.553136 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 24 22:30:16.553345 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:16.553330 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-7q6ck\""
Apr 24 22:30:16.553431 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:16.553411 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 24 22:30:16.558439 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:16.558404 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-fmj5c"]
Apr 24 22:30:16.693306 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:16.693265 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvsk4\" (UniqueName: \"kubernetes.io/projected/7350313d-9fe6-48f6-b2bc-d71d2b19525f-kube-api-access-fvsk4\") pod \"service-ca-865cb79987-fmj5c\" (UID: \"7350313d-9fe6-48f6-b2bc-d71d2b19525f\") " pod="openshift-service-ca/service-ca-865cb79987-fmj5c"
Apr 24 22:30:16.693474 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:16.693437 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7350313d-9fe6-48f6-b2bc-d71d2b19525f-signing-cabundle\") pod \"service-ca-865cb79987-fmj5c\" (UID: \"7350313d-9fe6-48f6-b2bc-d71d2b19525f\") " pod="openshift-service-ca/service-ca-865cb79987-fmj5c"
Apr 24 22:30:16.693528 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:16.693507 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7350313d-9fe6-48f6-b2bc-d71d2b19525f-signing-key\") pod \"service-ca-865cb79987-fmj5c\" (UID: \"7350313d-9fe6-48f6-b2bc-d71d2b19525f\") " pod="openshift-service-ca/service-ca-865cb79987-fmj5c"
Apr 24 22:30:16.794042 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:16.793947 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7350313d-9fe6-48f6-b2bc-d71d2b19525f-signing-cabundle\") pod \"service-ca-865cb79987-fmj5c\" (UID: \"7350313d-9fe6-48f6-b2bc-d71d2b19525f\") "
pod="openshift-service-ca/service-ca-865cb79987-fmj5c" Apr 24 22:30:16.794042 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:16.794016 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7350313d-9fe6-48f6-b2bc-d71d2b19525f-signing-key\") pod \"service-ca-865cb79987-fmj5c\" (UID: \"7350313d-9fe6-48f6-b2bc-d71d2b19525f\") " pod="openshift-service-ca/service-ca-865cb79987-fmj5c" Apr 24 22:30:16.794280 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:16.794048 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvsk4\" (UniqueName: \"kubernetes.io/projected/7350313d-9fe6-48f6-b2bc-d71d2b19525f-kube-api-access-fvsk4\") pod \"service-ca-865cb79987-fmj5c\" (UID: \"7350313d-9fe6-48f6-b2bc-d71d2b19525f\") " pod="openshift-service-ca/service-ca-865cb79987-fmj5c" Apr 24 22:30:16.794721 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:16.794680 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7350313d-9fe6-48f6-b2bc-d71d2b19525f-signing-cabundle\") pod \"service-ca-865cb79987-fmj5c\" (UID: \"7350313d-9fe6-48f6-b2bc-d71d2b19525f\") " pod="openshift-service-ca/service-ca-865cb79987-fmj5c" Apr 24 22:30:16.798313 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:16.798278 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7350313d-9fe6-48f6-b2bc-d71d2b19525f-signing-key\") pod \"service-ca-865cb79987-fmj5c\" (UID: \"7350313d-9fe6-48f6-b2bc-d71d2b19525f\") " pod="openshift-service-ca/service-ca-865cb79987-fmj5c" Apr 24 22:30:16.803843 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:16.803818 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvsk4\" (UniqueName: \"kubernetes.io/projected/7350313d-9fe6-48f6-b2bc-d71d2b19525f-kube-api-access-fvsk4\") pod 
\"service-ca-865cb79987-fmj5c\" (UID: \"7350313d-9fe6-48f6-b2bc-d71d2b19525f\") " pod="openshift-service-ca/service-ca-865cb79987-fmj5c" Apr 24 22:30:16.820515 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:16.820491 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-j4t6h_4f665f36-3e6e-4199-bbcd-df474abfeb86/node-ca/0.log" Apr 24 22:30:16.835207 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:16.835169 2582 generic.go:358] "Generic (PLEG): container finished" podID="44759cca-eeb7-4b34-af4d-65cef31d60a1" containerID="f84f190544c69fb867f899ae9f155850806318c9e08bf9eb107c1e129f9ff4c3" exitCode=0 Apr 24 22:30:16.835360 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:16.835258 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z9q4w" event={"ID":"44759cca-eeb7-4b34-af4d-65cef31d60a1","Type":"ContainerDied","Data":"f84f190544c69fb867f899ae9f155850806318c9e08bf9eb107c1e129f9ff4c3"} Apr 24 22:30:16.859911 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:16.859878 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-fmj5c" Apr 24 22:30:16.996039 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:16.995998 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s5brh\" (UID: \"4ffa9d0c-ccdc-4b8c-b83f-12076db312b8\") " pod="openshift-insights/insights-runtime-extractor-s5brh" Apr 24 22:30:16.996224 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:16.996155 2582 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 22:30:16.996224 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:16.996219 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-insights-runtime-extractor-tls podName:4ffa9d0c-ccdc-4b8c-b83f-12076db312b8 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:18.996200798 +0000 UTC m=+38.939992127 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-insights-runtime-extractor-tls") pod "insights-runtime-extractor-s5brh" (UID: "4ffa9d0c-ccdc-4b8c-b83f-12076db312b8") : secret "insights-runtime-extractor-tls" not found Apr 24 22:30:17.198392 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:17.198297 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-tls\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" Apr 24 22:30:17.198392 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:17.198376 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4fd3bad0-b406-44c2-b540-cbbaf1436e6d-cert\") pod \"ingress-canary-qchcv\" (UID: \"4fd3bad0-b406-44c2-b540-cbbaf1436e6d\") " pod="openshift-ingress-canary/ingress-canary-qchcv" Apr 24 22:30:17.198598 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:17.198428 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d25320e2-53da-44c0-bfb0-8ed3f795faf6-metrics-tls\") pod \"dns-default-vp89s\" (UID: \"d25320e2-53da-44c0-bfb0-8ed3f795faf6\") " pod="openshift-dns/dns-default-vp89s" Apr 24 22:30:17.198598 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:17.198482 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:17.198598 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:17.198501 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5568c976c5-5n4wk: secret "image-registry-tls" not found Apr 24 
22:30:17.198598 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:17.198560 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-tls podName:38193c2c-a2e9-44ba-9dd4-76687aaabfb3 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:21.198541753 +0000 UTC m=+41.142333062 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-tls") pod "image-registry-5568c976c5-5n4wk" (UID: "38193c2c-a2e9-44ba-9dd4-76687aaabfb3") : secret "image-registry-tls" not found Apr 24 22:30:17.198814 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:17.198606 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:17.198814 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:17.198652 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fd3bad0-b406-44c2-b540-cbbaf1436e6d-cert podName:4fd3bad0-b406-44c2-b540-cbbaf1436e6d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:21.198640247 +0000 UTC m=+41.142431556 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4fd3bad0-b406-44c2-b540-cbbaf1436e6d-cert") pod "ingress-canary-qchcv" (UID: "4fd3bad0-b406-44c2-b540-cbbaf1436e6d") : secret "canary-serving-cert" not found Apr 24 22:30:17.198814 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:17.198670 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:17.198814 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:17.198688 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d25320e2-53da-44c0-bfb0-8ed3f795faf6-metrics-tls podName:d25320e2-53da-44c0-bfb0-8ed3f795faf6 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:30:21.198682571 +0000 UTC m=+41.142473880 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d25320e2-53da-44c0-bfb0-8ed3f795faf6-metrics-tls") pod "dns-default-vp89s" (UID: "d25320e2-53da-44c0-bfb0-8ed3f795faf6") : secret "dns-default-metrics-tls" not found Apr 24 22:30:18.378939 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:18.378772 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-fmj5c"] Apr 24 22:30:18.383470 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:30:18.383441 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7350313d_9fe6_48f6_b2bc_d71d2b19525f.slice/crio-fe19b9bc2ef16ead07adb20e59f5a9b58bdba979a2fd1e4ab4f7db2d22dd1c7b WatchSource:0}: Error finding container fe19b9bc2ef16ead07adb20e59f5a9b58bdba979a2fd1e4ab4f7db2d22dd1c7b: Status 404 returned error can't find the container with id fe19b9bc2ef16ead07adb20e59f5a9b58bdba979a2fd1e4ab4f7db2d22dd1c7b Apr 24 22:30:18.841358 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:18.841322 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z9q4w" event={"ID":"44759cca-eeb7-4b34-af4d-65cef31d60a1","Type":"ContainerStarted","Data":"57f7d971266b120275fe886b6f0ab7054a76a3c1af3d4edb9714c9a7d566b713"} Apr 24 22:30:18.842534 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:18.842502 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7sw9z" event={"ID":"b5d8eefa-153f-46d6-8848-82778399a098","Type":"ContainerStarted","Data":"c19e3028077bcba2fa5ba93cd3446833adb107bc9d80d2e151784ea278f18c50"} Apr 24 22:30:18.842660 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:18.842634 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-network-diagnostics/network-check-target-7sw9z" Apr 24 22:30:18.843888 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:18.843862 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ff4l2" event={"ID":"ce0f15c5-e087-48f7-8284-7dcf34afed79","Type":"ContainerStarted","Data":"2bdad95c02c533cc927dbb6c3a3cc3a74856b13f9873745a44de4f43c047434d"} Apr 24 22:30:18.843979 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:18.843893 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ff4l2" event={"ID":"ce0f15c5-e087-48f7-8284-7dcf34afed79","Type":"ContainerStarted","Data":"db092068b728303131c61409191ee758adf23504a76b3f1aac6b59045dfd6235"} Apr 24 22:30:18.844754 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:18.844734 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-fmj5c" event={"ID":"7350313d-9fe6-48f6-b2bc-d71d2b19525f","Type":"ContainerStarted","Data":"fe19b9bc2ef16ead07adb20e59f5a9b58bdba979a2fd1e4ab4f7db2d22dd1c7b"} Apr 24 22:30:18.880476 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:18.880420 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-z9q4w" podStartSLOduration=7.329410426 podStartE2EDuration="38.880405547s" podCreationTimestamp="2026-04-24 22:29:40 +0000 UTC" firstStartedPulling="2026-04-24 22:29:43.258892687 +0000 UTC m=+3.202684009" lastFinishedPulling="2026-04-24 22:30:14.809887815 +0000 UTC m=+34.753679130" observedRunningTime="2026-04-24 22:30:18.87371107 +0000 UTC m=+38.817502398" watchObservedRunningTime="2026-04-24 22:30:18.880405547 +0000 UTC m=+38.824196877" Apr 24 22:30:18.899339 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:18.899291 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-7sw9z" 
podStartSLOduration=35.52949686 podStartE2EDuration="38.899280783s" podCreationTimestamp="2026-04-24 22:29:40 +0000 UTC" firstStartedPulling="2026-04-24 22:30:14.788346059 +0000 UTC m=+34.732137368" lastFinishedPulling="2026-04-24 22:30:18.158129982 +0000 UTC m=+38.101921291" observedRunningTime="2026-04-24 22:30:18.898643867 +0000 UTC m=+38.842435199" watchObservedRunningTime="2026-04-24 22:30:18.899280783 +0000 UTC m=+38.843072114" Apr 24 22:30:18.925462 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:18.925413 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ff4l2" podStartSLOduration=1.361352141 podStartE2EDuration="3.925397755s" podCreationTimestamp="2026-04-24 22:30:15 +0000 UTC" firstStartedPulling="2026-04-24 22:30:15.596195145 +0000 UTC m=+35.539986459" lastFinishedPulling="2026-04-24 22:30:18.160240759 +0000 UTC m=+38.104032073" observedRunningTime="2026-04-24 22:30:18.924928988 +0000 UTC m=+38.868720319" watchObservedRunningTime="2026-04-24 22:30:18.925397755 +0000 UTC m=+38.869189088" Apr 24 22:30:19.020022 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:19.019979 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s5brh\" (UID: \"4ffa9d0c-ccdc-4b8c-b83f-12076db312b8\") " pod="openshift-insights/insights-runtime-extractor-s5brh" Apr 24 22:30:19.020216 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:19.020152 2582 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 22:30:19.020216 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:19.020214 2582 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-insights-runtime-extractor-tls podName:4ffa9d0c-ccdc-4b8c-b83f-12076db312b8 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:23.020195653 +0000 UTC m=+42.963986961 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-insights-runtime-extractor-tls") pod "insights-runtime-extractor-s5brh" (UID: "4ffa9d0c-ccdc-4b8c-b83f-12076db312b8") : secret "insights-runtime-extractor-tls" not found Apr 24 22:30:20.850480 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:20.850389 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-fmj5c" event={"ID":"7350313d-9fe6-48f6-b2bc-d71d2b19525f","Type":"ContainerStarted","Data":"bffc23c81f8e3e4894172226878a8d5caf8e3465008c55726bd56cc75953f4c3"} Apr 24 22:30:20.872717 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:20.872665 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-fmj5c" podStartSLOduration=2.6588370709999998 podStartE2EDuration="4.872648363s" podCreationTimestamp="2026-04-24 22:30:16 +0000 UTC" firstStartedPulling="2026-04-24 22:30:18.385236008 +0000 UTC m=+38.329027319" lastFinishedPulling="2026-04-24 22:30:20.599047297 +0000 UTC m=+40.542838611" observedRunningTime="2026-04-24 22:30:20.872041335 +0000 UTC m=+40.815832667" watchObservedRunningTime="2026-04-24 22:30:20.872648363 +0000 UTC m=+40.816439696" Apr 24 22:30:21.239917 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:21.239841 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-tls\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " 
pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" Apr 24 22:30:21.239917 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:21.239892 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4fd3bad0-b406-44c2-b540-cbbaf1436e6d-cert\") pod \"ingress-canary-qchcv\" (UID: \"4fd3bad0-b406-44c2-b540-cbbaf1436e6d\") " pod="openshift-ingress-canary/ingress-canary-qchcv" Apr 24 22:30:21.240110 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:21.239921 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d25320e2-53da-44c0-bfb0-8ed3f795faf6-metrics-tls\") pod \"dns-default-vp89s\" (UID: \"d25320e2-53da-44c0-bfb0-8ed3f795faf6\") " pod="openshift-dns/dns-default-vp89s" Apr 24 22:30:21.240110 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:21.240020 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:21.240110 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:21.240032 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:21.240110 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:21.240021 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:21.240110 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:21.240102 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5568c976c5-5n4wk: secret "image-registry-tls" not found Apr 24 22:30:21.240110 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:21.240106 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fd3bad0-b406-44c2-b540-cbbaf1436e6d-cert podName:4fd3bad0-b406-44c2-b540-cbbaf1436e6d nodeName:}" failed. 
No retries permitted until 2026-04-24 22:30:29.240085095 +0000 UTC m=+49.183876411 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4fd3bad0-b406-44c2-b540-cbbaf1436e6d-cert") pod "ingress-canary-qchcv" (UID: "4fd3bad0-b406-44c2-b540-cbbaf1436e6d") : secret "canary-serving-cert" not found Apr 24 22:30:21.240289 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:21.240126 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d25320e2-53da-44c0-bfb0-8ed3f795faf6-metrics-tls podName:d25320e2-53da-44c0-bfb0-8ed3f795faf6 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:29.240116558 +0000 UTC m=+49.183907874 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d25320e2-53da-44c0-bfb0-8ed3f795faf6-metrics-tls") pod "dns-default-vp89s" (UID: "d25320e2-53da-44c0-bfb0-8ed3f795faf6") : secret "dns-default-metrics-tls" not found Apr 24 22:30:21.240289 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:21.240140 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-tls podName:38193c2c-a2e9-44ba-9dd4-76687aaabfb3 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:29.240132279 +0000 UTC m=+49.183923595 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-tls") pod "image-registry-5568c976c5-5n4wk" (UID: "38193c2c-a2e9-44ba-9dd4-76687aaabfb3") : secret "image-registry-tls" not found Apr 24 22:30:23.055206 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:23.055161 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s5brh\" (UID: \"4ffa9d0c-ccdc-4b8c-b83f-12076db312b8\") " pod="openshift-insights/insights-runtime-extractor-s5brh" Apr 24 22:30:23.055590 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:23.055360 2582 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 22:30:23.055590 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:30:23.055447 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-insights-runtime-extractor-tls podName:4ffa9d0c-ccdc-4b8c-b83f-12076db312b8 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:31.05542338 +0000 UTC m=+50.999214695 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-insights-runtime-extractor-tls") pod "insights-runtime-extractor-s5brh" (UID: "4ffa9d0c-ccdc-4b8c-b83f-12076db312b8") : secret "insights-runtime-extractor-tls" not found Apr 24 22:30:25.575585 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:25.575543 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bcb4810e-da56-4f18-8ac9-65765230513d-original-pull-secret\") pod \"global-pull-secret-syncer-zvc7f\" (UID: \"bcb4810e-da56-4f18-8ac9-65765230513d\") " pod="kube-system/global-pull-secret-syncer-zvc7f" Apr 24 22:30:25.579011 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:25.578976 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bcb4810e-da56-4f18-8ac9-65765230513d-original-pull-secret\") pod \"global-pull-secret-syncer-zvc7f\" (UID: \"bcb4810e-da56-4f18-8ac9-65765230513d\") " pod="kube-system/global-pull-secret-syncer-zvc7f" Apr 24 22:30:25.657544 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:25.657510 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-zvc7f" Apr 24 22:30:25.791463 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:25.791433 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-zvc7f"] Apr 24 22:30:25.794403 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:30:25.794377 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcb4810e_da56_4f18_8ac9_65765230513d.slice/crio-576c10349212b965b5749aa1f7323e64999707aac1ff95dc8e53dd719792fd52 WatchSource:0}: Error finding container 576c10349212b965b5749aa1f7323e64999707aac1ff95dc8e53dd719792fd52: Status 404 returned error can't find the container with id 576c10349212b965b5749aa1f7323e64999707aac1ff95dc8e53dd719792fd52 Apr 24 22:30:25.861300 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:25.861214 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-zvc7f" event={"ID":"bcb4810e-da56-4f18-8ac9-65765230513d","Type":"ContainerStarted","Data":"576c10349212b965b5749aa1f7323e64999707aac1ff95dc8e53dd719792fd52"} Apr 24 22:30:29.307036 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:29.306984 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-tls\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" Apr 24 22:30:29.307560 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:29.307086 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4fd3bad0-b406-44c2-b540-cbbaf1436e6d-cert\") pod \"ingress-canary-qchcv\" (UID: \"4fd3bad0-b406-44c2-b540-cbbaf1436e6d\") " pod="openshift-ingress-canary/ingress-canary-qchcv" Apr 24 22:30:29.307560 
ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:29.307132 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d25320e2-53da-44c0-bfb0-8ed3f795faf6-metrics-tls\") pod \"dns-default-vp89s\" (UID: \"d25320e2-53da-44c0-bfb0-8ed3f795faf6\") " pod="openshift-dns/dns-default-vp89s" Apr 24 22:30:29.310469 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:29.310212 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4fd3bad0-b406-44c2-b540-cbbaf1436e6d-cert\") pod \"ingress-canary-qchcv\" (UID: \"4fd3bad0-b406-44c2-b540-cbbaf1436e6d\") " pod="openshift-ingress-canary/ingress-canary-qchcv" Apr 24 22:30:29.310469 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:29.310244 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-tls\") pod \"image-registry-5568c976c5-5n4wk\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" Apr 24 22:30:29.310469 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:29.310428 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d25320e2-53da-44c0-bfb0-8ed3f795faf6-metrics-tls\") pod \"dns-default-vp89s\" (UID: \"d25320e2-53da-44c0-bfb0-8ed3f795faf6\") " pod="openshift-dns/dns-default-vp89s" Apr 24 22:30:29.338561 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:29.338526 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" Apr 24 22:30:29.352700 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:29.352665 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vp89s" Apr 24 22:30:29.359413 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:29.359386 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qchcv" Apr 24 22:30:29.805082 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:29.805031 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qchcv"] Apr 24 22:30:29.825387 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:29.823294 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vp89s"] Apr 24 22:30:29.839898 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:29.839867 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5568c976c5-5n4wk"] Apr 24 22:30:29.962454 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:30:29.962369 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fd3bad0_b406_44c2_b540_cbbaf1436e6d.slice/crio-d4b7610818d7e6e411aec5fffcf43bb6ef9b1f902c69f751223dc05ff8586a5e WatchSource:0}: Error finding container d4b7610818d7e6e411aec5fffcf43bb6ef9b1f902c69f751223dc05ff8586a5e: Status 404 returned error can't find the container with id d4b7610818d7e6e411aec5fffcf43bb6ef9b1f902c69f751223dc05ff8586a5e Apr 24 22:30:29.963025 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:30:29.962981 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd25320e2_53da_44c0_bfb0_8ed3f795faf6.slice/crio-9cf381c7aeef093437f397471732a2c5edc5efe4d091dcd7d67d3fba7350d8dd WatchSource:0}: Error finding container 9cf381c7aeef093437f397471732a2c5edc5efe4d091dcd7d67d3fba7350d8dd: Status 404 returned error can't find the container with id 9cf381c7aeef093437f397471732a2c5edc5efe4d091dcd7d67d3fba7350d8dd Apr 24 22:30:29.963955 ip-10-0-135-222 
kubenswrapper[2582]: W0424 22:30:29.963823 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38193c2c_a2e9_44ba_9dd4_76687aaabfb3.slice/crio-930cb7fbcfe7d40f19cce76844571061370d141988cc9f645fb13cb90d801f00 WatchSource:0}: Error finding container 930cb7fbcfe7d40f19cce76844571061370d141988cc9f645fb13cb90d801f00: Status 404 returned error can't find the container with id 930cb7fbcfe7d40f19cce76844571061370d141988cc9f645fb13cb90d801f00
Apr 24 22:30:30.873032 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:30.872989 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vp89s" event={"ID":"d25320e2-53da-44c0-bfb0-8ed3f795faf6","Type":"ContainerStarted","Data":"9cf381c7aeef093437f397471732a2c5edc5efe4d091dcd7d67d3fba7350d8dd"}
Apr 24 22:30:30.874754 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:30.874719 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-zvc7f" event={"ID":"bcb4810e-da56-4f18-8ac9-65765230513d","Type":"ContainerStarted","Data":"6bf98e9d828e9023b5957481a95aa002b1d23d4e6db5435f86fee9dfb8f40959"}
Apr 24 22:30:30.876079 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:30.876035 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qchcv" event={"ID":"4fd3bad0-b406-44c2-b540-cbbaf1436e6d","Type":"ContainerStarted","Data":"d4b7610818d7e6e411aec5fffcf43bb6ef9b1f902c69f751223dc05ff8586a5e"}
Apr 24 22:30:30.877566 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:30.877537 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" event={"ID":"38193c2c-a2e9-44ba-9dd4-76687aaabfb3","Type":"ContainerStarted","Data":"5ecfad3b229811170eacacfa89b3991aa28fd41cbf2dc06e513ebe796827e1ce"}
Apr 24 22:30:30.877676 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:30.877566 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" event={"ID":"38193c2c-a2e9-44ba-9dd4-76687aaabfb3","Type":"ContainerStarted","Data":"930cb7fbcfe7d40f19cce76844571061370d141988cc9f645fb13cb90d801f00"}
Apr 24 22:30:30.877984 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:30.877968 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5568c976c5-5n4wk"
Apr 24 22:30:30.894789 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:30.894732 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-zvc7f" podStartSLOduration=33.698782739 podStartE2EDuration="37.894717736s" podCreationTimestamp="2026-04-24 22:29:53 +0000 UTC" firstStartedPulling="2026-04-24 22:30:25.796091856 +0000 UTC m=+45.739883166" lastFinishedPulling="2026-04-24 22:30:29.992026854 +0000 UTC m=+49.935818163" observedRunningTime="2026-04-24 22:30:30.894431576 +0000 UTC m=+50.838222911" watchObservedRunningTime="2026-04-24 22:30:30.894717736 +0000 UTC m=+50.838509066"
Apr 24 22:30:30.913861 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:30.913807 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" podStartSLOduration=39.913786578 podStartE2EDuration="39.913786578s" podCreationTimestamp="2026-04-24 22:29:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:30:30.912723586 +0000 UTC m=+50.856514921" watchObservedRunningTime="2026-04-24 22:30:30.913786578 +0000 UTC m=+50.857577910"
Apr 24 22:30:31.124244 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:31.124157 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s5brh\" (UID: \"4ffa9d0c-ccdc-4b8c-b83f-12076db312b8\") " pod="openshift-insights/insights-runtime-extractor-s5brh"
Apr 24 22:30:31.136665 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:31.136639 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4ffa9d0c-ccdc-4b8c-b83f-12076db312b8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s5brh\" (UID: \"4ffa9d0c-ccdc-4b8c-b83f-12076db312b8\") " pod="openshift-insights/insights-runtime-extractor-s5brh"
Apr 24 22:30:31.174109 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:31.174069 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-s5brh"
Apr 24 22:30:31.325974 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:31.325938 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-s5brh"]
Apr 24 22:30:31.329561 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:30:31.329533 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ffa9d0c_ccdc_4b8c_b83f_12076db312b8.slice/crio-344a8685d6828f111c5c152311aecb6a0f3902ad6c3671944821d3862e55829b WatchSource:0}: Error finding container 344a8685d6828f111c5c152311aecb6a0f3902ad6c3671944821d3862e55829b: Status 404 returned error can't find the container with id 344a8685d6828f111c5c152311aecb6a0f3902ad6c3671944821d3862e55829b
Apr 24 22:30:31.882266 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:31.882233 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s5brh" event={"ID":"4ffa9d0c-ccdc-4b8c-b83f-12076db312b8","Type":"ContainerStarted","Data":"bf50950a0cb91c5abe8fb879ebe91ad3af536f470a997548406a8da4c8757eaa"}
Apr 24 22:30:31.882266 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:31.882269 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s5brh" event={"ID":"4ffa9d0c-ccdc-4b8c-b83f-12076db312b8","Type":"ContainerStarted","Data":"344a8685d6828f111c5c152311aecb6a0f3902ad6c3671944821d3862e55829b"}
Apr 24 22:30:33.889199 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:33.889158 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s5brh" event={"ID":"4ffa9d0c-ccdc-4b8c-b83f-12076db312b8","Type":"ContainerStarted","Data":"19e38257ae7424f91749333e56e221ea896b80c67f1bdd75e5d5c49dfdde0f4c"}
Apr 24 22:30:33.890664 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:33.890638 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vp89s" event={"ID":"d25320e2-53da-44c0-bfb0-8ed3f795faf6","Type":"ContainerStarted","Data":"c0e542ee543e042dd0487f93957b365b413687ecd53d12781ee0c3370347b010"}
Apr 24 22:30:33.890664 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:33.890669 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vp89s" event={"ID":"d25320e2-53da-44c0-bfb0-8ed3f795faf6","Type":"ContainerStarted","Data":"3b844ba12d732e9a0db89a92500d24d27cdc29f4bc87caba5c3803a8c60f8f66"}
Apr 24 22:30:33.890852 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:33.890751 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-vp89s"
Apr 24 22:30:33.891877 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:33.891853 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qchcv" event={"ID":"4fd3bad0-b406-44c2-b540-cbbaf1436e6d","Type":"ContainerStarted","Data":"6556524f5f7a721bd2c398e7830a41a2db14d21092a66bd5a4f2c30b3ce4acd3"}
Apr 24 22:30:33.910239 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:33.910196 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vp89s" podStartSLOduration=18.032491793 podStartE2EDuration="20.910181192s" podCreationTimestamp="2026-04-24 22:30:13 +0000 UTC" firstStartedPulling="2026-04-24 22:30:29.979501771 +0000 UTC m=+49.923293083" lastFinishedPulling="2026-04-24 22:30:32.85719117 +0000 UTC m=+52.800982482" observedRunningTime="2026-04-24 22:30:33.909237712 +0000 UTC m=+53.853029042" watchObservedRunningTime="2026-04-24 22:30:33.910181192 +0000 UTC m=+53.853972522"
Apr 24 22:30:33.929643 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:33.929596 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qchcv" podStartSLOduration=18.04755028 podStartE2EDuration="20.929581355s" podCreationTimestamp="2026-04-24 22:30:13 +0000 UTC" firstStartedPulling="2026-04-24 22:30:29.979538358 +0000 UTC m=+49.923329682" lastFinishedPulling="2026-04-24 22:30:32.861569445 +0000 UTC m=+52.805360757" observedRunningTime="2026-04-24 22:30:33.928795973 +0000 UTC m=+53.872587305" watchObservedRunningTime="2026-04-24 22:30:33.929581355 +0000 UTC m=+53.873372686"
Apr 24 22:30:35.898897 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:35.898859 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s5brh" event={"ID":"4ffa9d0c-ccdc-4b8c-b83f-12076db312b8","Type":"ContainerStarted","Data":"fadb7ed2478c495b3d44695a2784b8196049bce523ebe9f64a8319909edd220c"}
Apr 24 22:30:35.918275 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:35.918228 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-s5brh" podStartSLOduration=16.91381937 podStartE2EDuration="20.918214288s" podCreationTimestamp="2026-04-24 22:30:15 +0000 UTC" firstStartedPulling="2026-04-24 22:30:31.46325883 +0000 UTC m=+51.407050156" lastFinishedPulling="2026-04-24 22:30:35.467653764 +0000 UTC m=+55.411445074" observedRunningTime="2026-04-24 22:30:35.91743018 +0000 UTC m=+55.861221510" watchObservedRunningTime="2026-04-24 22:30:35.918214288 +0000 UTC m=+55.862005618"
Apr 24 22:30:36.303367 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.303335 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5568c976c5-5n4wk"]
Apr 24 22:30:36.312780 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.312749 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-546sf"]
Apr 24 22:30:36.333979 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.333951 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-546sf"
Apr 24 22:30:36.336540 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.336518 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-qcn49\""
Apr 24 22:30:36.337772 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.337745 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 24 22:30:36.337873 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.337833 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 24 22:30:36.340711 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.340689 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-546sf"]
Apr 24 22:30:36.361315 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.361286 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpr4n\" (UniqueName: \"kubernetes.io/projected/787faccd-4c44-470b-a814-aed056407d9a-kube-api-access-vpr4n\") pod \"downloads-6bcc868b7-546sf\" (UID: \"787faccd-4c44-470b-a814-aed056407d9a\") " pod="openshift-console/downloads-6bcc868b7-546sf"
Apr 24 22:30:36.395992 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.395956 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5db684cb8c-4nqc9"]
Apr 24 22:30:36.410679 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.410653 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.418727 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.418691 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5db684cb8c-4nqc9"]
Apr 24 22:30:36.461909 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.461873 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f2c2d47-f231-4b74-8af2-1181e5558d09-trusted-ca\") pod \"image-registry-5db684cb8c-4nqc9\" (UID: \"4f2c2d47-f231-4b74-8af2-1181e5558d09\") " pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.461909 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.461907 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4f2c2d47-f231-4b74-8af2-1181e5558d09-registry-tls\") pod \"image-registry-5db684cb8c-4nqc9\" (UID: \"4f2c2d47-f231-4b74-8af2-1181e5558d09\") " pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.462133 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.461934 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4f2c2d47-f231-4b74-8af2-1181e5558d09-image-registry-private-configuration\") pod \"image-registry-5db684cb8c-4nqc9\" (UID: \"4f2c2d47-f231-4b74-8af2-1181e5558d09\") " pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.462133 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.462004 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9qz5\" (UniqueName: \"kubernetes.io/projected/4f2c2d47-f231-4b74-8af2-1181e5558d09-kube-api-access-q9qz5\") pod \"image-registry-5db684cb8c-4nqc9\" (UID: \"4f2c2d47-f231-4b74-8af2-1181e5558d09\") " pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.462133 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.462079 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4f2c2d47-f231-4b74-8af2-1181e5558d09-registry-certificates\") pod \"image-registry-5db684cb8c-4nqc9\" (UID: \"4f2c2d47-f231-4b74-8af2-1181e5558d09\") " pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.462133 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.462115 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4f2c2d47-f231-4b74-8af2-1181e5558d09-ca-trust-extracted\") pod \"image-registry-5db684cb8c-4nqc9\" (UID: \"4f2c2d47-f231-4b74-8af2-1181e5558d09\") " pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.462256 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.462135 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4f2c2d47-f231-4b74-8af2-1181e5558d09-installation-pull-secrets\") pod \"image-registry-5db684cb8c-4nqc9\" (UID: \"4f2c2d47-f231-4b74-8af2-1181e5558d09\") " pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.462256 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.462157 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpr4n\" (UniqueName: \"kubernetes.io/projected/787faccd-4c44-470b-a814-aed056407d9a-kube-api-access-vpr4n\") pod \"downloads-6bcc868b7-546sf\" (UID: \"787faccd-4c44-470b-a814-aed056407d9a\") " pod="openshift-console/downloads-6bcc868b7-546sf"
Apr 24 22:30:36.462256 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.462210 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f2c2d47-f231-4b74-8af2-1181e5558d09-bound-sa-token\") pod \"image-registry-5db684cb8c-4nqc9\" (UID: \"4f2c2d47-f231-4b74-8af2-1181e5558d09\") " pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.477069 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.477040 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpr4n\" (UniqueName: \"kubernetes.io/projected/787faccd-4c44-470b-a814-aed056407d9a-kube-api-access-vpr4n\") pod \"downloads-6bcc868b7-546sf\" (UID: \"787faccd-4c44-470b-a814-aed056407d9a\") " pod="openshift-console/downloads-6bcc868b7-546sf"
Apr 24 22:30:36.562809 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.562719 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f2c2d47-f231-4b74-8af2-1181e5558d09-bound-sa-token\") pod \"image-registry-5db684cb8c-4nqc9\" (UID: \"4f2c2d47-f231-4b74-8af2-1181e5558d09\") " pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.562809 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.562767 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f2c2d47-f231-4b74-8af2-1181e5558d09-trusted-ca\") pod \"image-registry-5db684cb8c-4nqc9\" (UID: \"4f2c2d47-f231-4b74-8af2-1181e5558d09\") " pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.563091 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.562890 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4f2c2d47-f231-4b74-8af2-1181e5558d09-registry-tls\") pod \"image-registry-5db684cb8c-4nqc9\" (UID: \"4f2c2d47-f231-4b74-8af2-1181e5558d09\") " pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.563091 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.562954 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4f2c2d47-f231-4b74-8af2-1181e5558d09-image-registry-private-configuration\") pod \"image-registry-5db684cb8c-4nqc9\" (UID: \"4f2c2d47-f231-4b74-8af2-1181e5558d09\") " pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.563091 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.562989 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q9qz5\" (UniqueName: \"kubernetes.io/projected/4f2c2d47-f231-4b74-8af2-1181e5558d09-kube-api-access-q9qz5\") pod \"image-registry-5db684cb8c-4nqc9\" (UID: \"4f2c2d47-f231-4b74-8af2-1181e5558d09\") " pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.563091 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.563085 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4f2c2d47-f231-4b74-8af2-1181e5558d09-registry-certificates\") pod \"image-registry-5db684cb8c-4nqc9\" (UID: \"4f2c2d47-f231-4b74-8af2-1181e5558d09\") " pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.563301 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.563145 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4f2c2d47-f231-4b74-8af2-1181e5558d09-ca-trust-extracted\") pod \"image-registry-5db684cb8c-4nqc9\" (UID: \"4f2c2d47-f231-4b74-8af2-1181e5558d09\") " pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.563301 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.563178 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4f2c2d47-f231-4b74-8af2-1181e5558d09-installation-pull-secrets\") pod \"image-registry-5db684cb8c-4nqc9\" (UID: \"4f2c2d47-f231-4b74-8af2-1181e5558d09\") " pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.563565 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.563530 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4f2c2d47-f231-4b74-8af2-1181e5558d09-ca-trust-extracted\") pod \"image-registry-5db684cb8c-4nqc9\" (UID: \"4f2c2d47-f231-4b74-8af2-1181e5558d09\") " pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.563962 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.563944 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4f2c2d47-f231-4b74-8af2-1181e5558d09-registry-certificates\") pod \"image-registry-5db684cb8c-4nqc9\" (UID: \"4f2c2d47-f231-4b74-8af2-1181e5558d09\") " pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.564199 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.564176 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f2c2d47-f231-4b74-8af2-1181e5558d09-trusted-ca\") pod \"image-registry-5db684cb8c-4nqc9\" (UID: \"4f2c2d47-f231-4b74-8af2-1181e5558d09\") " pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.565654 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.565637 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4f2c2d47-f231-4b74-8af2-1181e5558d09-registry-tls\") pod \"image-registry-5db684cb8c-4nqc9\" (UID: \"4f2c2d47-f231-4b74-8af2-1181e5558d09\") " pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.566018 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.565999 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4f2c2d47-f231-4b74-8af2-1181e5558d09-installation-pull-secrets\") pod \"image-registry-5db684cb8c-4nqc9\" (UID: \"4f2c2d47-f231-4b74-8af2-1181e5558d09\") " pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.566218 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.566197 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4f2c2d47-f231-4b74-8af2-1181e5558d09-image-registry-private-configuration\") pod \"image-registry-5db684cb8c-4nqc9\" (UID: \"4f2c2d47-f231-4b74-8af2-1181e5558d09\") " pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.580722 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.580685 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9qz5\" (UniqueName: \"kubernetes.io/projected/4f2c2d47-f231-4b74-8af2-1181e5558d09-kube-api-access-q9qz5\") pod \"image-registry-5db684cb8c-4nqc9\" (UID: \"4f2c2d47-f231-4b74-8af2-1181e5558d09\") " pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.587711 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.587676 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f2c2d47-f231-4b74-8af2-1181e5558d09-bound-sa-token\") pod \"image-registry-5db684cb8c-4nqc9\" (UID: \"4f2c2d47-f231-4b74-8af2-1181e5558d09\") " pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.643658 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.643623 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-546sf"
Apr 24 22:30:36.720320 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.720289 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:36.804964 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.804930 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-546sf"]
Apr 24 22:30:36.808359 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:30:36.808325 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod787faccd_4c44_470b_a814_aed056407d9a.slice/crio-f41de328a47df792bfb14e2575d55095cda74427c3f4218f21d69b368ef16cb9 WatchSource:0}: Error finding container f41de328a47df792bfb14e2575d55095cda74427c3f4218f21d69b368ef16cb9: Status 404 returned error can't find the container with id f41de328a47df792bfb14e2575d55095cda74427c3f4218f21d69b368ef16cb9
Apr 24 22:30:36.903020 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.902986 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-546sf" event={"ID":"787faccd-4c44-470b-a814-aed056407d9a","Type":"ContainerStarted","Data":"f41de328a47df792bfb14e2575d55095cda74427c3f4218f21d69b368ef16cb9"}
Apr 24 22:30:36.911452 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:36.911420 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5db684cb8c-4nqc9"]
Apr 24 22:30:36.914825 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:30:36.914799 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f2c2d47_f231_4b74_8af2_1181e5558d09.slice/crio-5a8b94a44a2c9ef5da4ae6799d0dc75c9d7f6d46c3c0f561a9bd81dd143a382a WatchSource:0}: Error finding container 5a8b94a44a2c9ef5da4ae6799d0dc75c9d7f6d46c3c0f561a9bd81dd143a382a: Status 404 returned error can't find the container with id 5a8b94a44a2c9ef5da4ae6799d0dc75c9d7f6d46c3c0f561a9bd81dd143a382a
Apr 24 22:30:37.907676 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:37.907602 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9" event={"ID":"4f2c2d47-f231-4b74-8af2-1181e5558d09","Type":"ContainerStarted","Data":"468aef4865c2f389803ec37292a3404693d15d6966f7c00373425ce70221f513"}
Apr 24 22:30:37.907676 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:37.907649 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9" event={"ID":"4f2c2d47-f231-4b74-8af2-1181e5558d09","Type":"ContainerStarted","Data":"5a8b94a44a2c9ef5da4ae6799d0dc75c9d7f6d46c3c0f561a9bd81dd143a382a"}
Apr 24 22:30:37.908181 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:37.907745 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:30:37.928812 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:37.928763 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9" podStartSLOduration=1.9287446940000001 podStartE2EDuration="1.928744694s" podCreationTimestamp="2026-04-24 22:30:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:30:37.927204554 +0000 UTC m=+57.870995884" watchObservedRunningTime="2026-04-24 22:30:37.928744694 +0000 UTC m=+57.872536027"
Apr 24 22:30:39.824722 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:39.824685 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dzsgr"
Apr 24 22:30:43.898075 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:43.898029 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vp89s"
Apr 24 22:30:46.310269 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:46.310228 2582 patch_prober.go:28] interesting pod/image-registry-5568c976c5-5n4wk container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 24 22:30:46.310678 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:46.310299 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" podUID="38193c2c-a2e9-44ba-9dd4-76687aaabfb3" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:30:46.348072 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:46.348031 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/171d0bdf-1d87-4aee-9fad-9c28075596bd-metrics-certs\") pod \"network-metrics-daemon-hgvbb\" (UID: \"171d0bdf-1d87-4aee-9fad-9c28075596bd\") " pod="openshift-multus/network-metrics-daemon-hgvbb"
Apr 24 22:30:46.350915 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:46.350886 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/171d0bdf-1d87-4aee-9fad-9c28075596bd-metrics-certs\") pod \"network-metrics-daemon-hgvbb\" (UID: \"171d0bdf-1d87-4aee-9fad-9c28075596bd\") " pod="openshift-multus/network-metrics-daemon-hgvbb"
Apr 24 22:30:46.645981 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:46.645894 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-74npb\""
Apr 24 22:30:46.654159 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:46.654134 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgvbb"
Apr 24 22:30:46.811915 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:30:46.811876 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod171d0bdf_1d87_4aee_9fad_9c28075596bd.slice/crio-364314e30a17a753b791f8d2a1f79163f9b9ecf316ef6f9d5ce01247a6c96c43 WatchSource:0}: Error finding container 364314e30a17a753b791f8d2a1f79163f9b9ecf316ef6f9d5ce01247a6c96c43: Status 404 returned error can't find the container with id 364314e30a17a753b791f8d2a1f79163f9b9ecf316ef6f9d5ce01247a6c96c43
Apr 24 22:30:46.814938 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:46.814887 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hgvbb"]
Apr 24 22:30:46.932462 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:46.932375 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hgvbb" event={"ID":"171d0bdf-1d87-4aee-9fad-9c28075596bd","Type":"ContainerStarted","Data":"364314e30a17a753b791f8d2a1f79163f9b9ecf316ef6f9d5ce01247a6c96c43"}
Apr 24 22:30:49.850586 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:49.850556 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-7sw9z"
Apr 24 22:30:54.955886 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:54.955841 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-546sf" event={"ID":"787faccd-4c44-470b-a814-aed056407d9a","Type":"ContainerStarted","Data":"7aeb8a5ed9e1ab75b4f67d6f5950c70baad15c43ae2aafcff1db19838768fa83"}
Apr 24 22:30:54.956356 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:54.956097 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-546sf"
Apr 24 22:30:54.970157 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:54.970120 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-546sf"
Apr 24 22:30:54.995780 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:54.995653 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-546sf" podStartSLOduration=1.62391537 podStartE2EDuration="18.995639163s" podCreationTimestamp="2026-04-24 22:30:36 +0000 UTC" firstStartedPulling="2026-04-24 22:30:36.810799127 +0000 UTC m=+56.754590436" lastFinishedPulling="2026-04-24 22:30:54.182522918 +0000 UTC m=+74.126314229" observedRunningTime="2026-04-24 22:30:54.994243947 +0000 UTC m=+74.938035321" watchObservedRunningTime="2026-04-24 22:30:54.995639163 +0000 UTC m=+74.939430495"
Apr 24 22:30:55.961101 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:55.961046 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hgvbb" event={"ID":"171d0bdf-1d87-4aee-9fad-9c28075596bd","Type":"ContainerStarted","Data":"06cf19cb60a13eb69d5dd8c5d30b5b0a8dfd53fdda9eaf7e7a6d36333752b764"}
Apr 24 22:30:55.961514 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:55.961107 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hgvbb" event={"ID":"171d0bdf-1d87-4aee-9fad-9c28075596bd","Type":"ContainerStarted","Data":"f72e39a86c2381dd7f7ea05fa128d07ccb241b828daff08aab5bffe77e05d729"}
Apr 24 22:30:55.984311 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:55.984257 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-hgvbb" podStartSLOduration=67.874868585 podStartE2EDuration="1m15.984242713s" podCreationTimestamp="2026-04-24 22:29:40 +0000 UTC" firstStartedPulling="2026-04-24 22:30:46.815406097 +0000 UTC m=+66.759197406" lastFinishedPulling="2026-04-24 22:30:54.924780221 +0000 UTC m=+74.868571534" observedRunningTime="2026-04-24 22:30:55.983573241 +0000 UTC m=+75.927364572" watchObservedRunningTime="2026-04-24 22:30:55.984242713 +0000 UTC m=+75.928034044"
Apr 24 22:30:56.309446 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:56.309415 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5568c976c5-5n4wk"
Apr 24 22:30:58.914679 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:30:58.914546 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5db684cb8c-4nqc9"
Apr 24 22:31:01.322583 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.322503 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" podUID="38193c2c-a2e9-44ba-9dd4-76687aaabfb3" containerName="registry" containerID="cri-o://5ecfad3b229811170eacacfa89b3991aa28fd41cbf2dc06e513ebe796827e1ce" gracePeriod=30
Apr 24 22:31:01.601129 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.601099 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5568c976c5-5n4wk"
Apr 24 22:31:01.677755 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.677718 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-installation-pull-secrets\") pod \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") "
Apr 24 22:31:01.677952 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.677779 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-tls\") pod \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") "
Apr 24 22:31:01.677952 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.677813 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-certificates\") pod \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") "
Apr 24 22:31:01.677952 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.677843 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5q75\" (UniqueName: \"kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-kube-api-access-x5q75\") pod \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") "
Apr 24 22:31:01.677952 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.677873 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-image-registry-private-configuration\") pod \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\" (UID:
\"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " Apr 24 22:31:01.677952 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.677901 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-ca-trust-extracted\") pod \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " Apr 24 22:31:01.677952 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.677928 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-bound-sa-token\") pod \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " Apr 24 22:31:01.677952 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.677954 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-trusted-ca\") pod \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\" (UID: \"38193c2c-a2e9-44ba-9dd4-76687aaabfb3\") " Apr 24 22:31:01.678494 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.678464 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "38193c2c-a2e9-44ba-9dd4-76687aaabfb3" (UID: "38193c2c-a2e9-44ba-9dd4-76687aaabfb3"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:31:01.678672 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.678475 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "38193c2c-a2e9-44ba-9dd4-76687aaabfb3" (UID: "38193c2c-a2e9-44ba-9dd4-76687aaabfb3"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:31:01.680976 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.680937 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "38193c2c-a2e9-44ba-9dd4-76687aaabfb3" (UID: "38193c2c-a2e9-44ba-9dd4-76687aaabfb3"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:31:01.681223 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.681165 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-kube-api-access-x5q75" (OuterVolumeSpecName: "kube-api-access-x5q75") pod "38193c2c-a2e9-44ba-9dd4-76687aaabfb3" (UID: "38193c2c-a2e9-44ba-9dd4-76687aaabfb3"). InnerVolumeSpecName "kube-api-access-x5q75". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:31:01.681303 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.681264 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "38193c2c-a2e9-44ba-9dd4-76687aaabfb3" (UID: "38193c2c-a2e9-44ba-9dd4-76687aaabfb3"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:31:01.681574 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.681547 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "38193c2c-a2e9-44ba-9dd4-76687aaabfb3" (UID: "38193c2c-a2e9-44ba-9dd4-76687aaabfb3"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:31:01.681649 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.681587 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "38193c2c-a2e9-44ba-9dd4-76687aaabfb3" (UID: "38193c2c-a2e9-44ba-9dd4-76687aaabfb3"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:31:01.689247 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.689208 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "38193c2c-a2e9-44ba-9dd4-76687aaabfb3" (UID: "38193c2c-a2e9-44ba-9dd4-76687aaabfb3"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:31:01.779504 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.779463 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x5q75\" (UniqueName: \"kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-kube-api-access-x5q75\") on node \"ip-10-0-135-222.ec2.internal\" DevicePath \"\"" Apr 24 22:31:01.779504 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.779505 2582 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-image-registry-private-configuration\") on node \"ip-10-0-135-222.ec2.internal\" DevicePath \"\"" Apr 24 22:31:01.779715 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.779521 2582 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-ca-trust-extracted\") on node \"ip-10-0-135-222.ec2.internal\" DevicePath \"\"" Apr 24 
22:31:01.779715 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.779536 2582 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-bound-sa-token\") on node \"ip-10-0-135-222.ec2.internal\" DevicePath \"\"" Apr 24 22:31:01.779715 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.779552 2582 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-trusted-ca\") on node \"ip-10-0-135-222.ec2.internal\" DevicePath \"\"" Apr 24 22:31:01.779715 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.779565 2582 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-installation-pull-secrets\") on node \"ip-10-0-135-222.ec2.internal\" DevicePath \"\"" Apr 24 22:31:01.779715 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.779579 2582 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-tls\") on node \"ip-10-0-135-222.ec2.internal\" DevicePath \"\"" Apr 24 22:31:01.779715 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.779594 2582 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/38193c2c-a2e9-44ba-9dd4-76687aaabfb3-registry-certificates\") on node \"ip-10-0-135-222.ec2.internal\" DevicePath \"\"" Apr 24 22:31:01.983670 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.983575 2582 generic.go:358] "Generic (PLEG): container finished" podID="38193c2c-a2e9-44ba-9dd4-76687aaabfb3" containerID="5ecfad3b229811170eacacfa89b3991aa28fd41cbf2dc06e513ebe796827e1ce" exitCode=0 Apr 24 22:31:01.983670 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.983626 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" event={"ID":"38193c2c-a2e9-44ba-9dd4-76687aaabfb3","Type":"ContainerDied","Data":"5ecfad3b229811170eacacfa89b3991aa28fd41cbf2dc06e513ebe796827e1ce"} Apr 24 22:31:01.983670 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.983654 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" Apr 24 22:31:01.983670 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.983665 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5568c976c5-5n4wk" event={"ID":"38193c2c-a2e9-44ba-9dd4-76687aaabfb3","Type":"ContainerDied","Data":"930cb7fbcfe7d40f19cce76844571061370d141988cc9f645fb13cb90d801f00"} Apr 24 22:31:01.983962 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.983686 2582 scope.go:117] "RemoveContainer" containerID="5ecfad3b229811170eacacfa89b3991aa28fd41cbf2dc06e513ebe796827e1ce" Apr 24 22:31:01.993364 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.993341 2582 scope.go:117] "RemoveContainer" containerID="5ecfad3b229811170eacacfa89b3991aa28fd41cbf2dc06e513ebe796827e1ce" Apr 24 22:31:01.993680 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:31:01.993655 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ecfad3b229811170eacacfa89b3991aa28fd41cbf2dc06e513ebe796827e1ce\": container with ID starting with 5ecfad3b229811170eacacfa89b3991aa28fd41cbf2dc06e513ebe796827e1ce not found: ID does not exist" containerID="5ecfad3b229811170eacacfa89b3991aa28fd41cbf2dc06e513ebe796827e1ce" Apr 24 22:31:01.993754 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:01.993693 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ecfad3b229811170eacacfa89b3991aa28fd41cbf2dc06e513ebe796827e1ce"} err="failed to get container status 
\"5ecfad3b229811170eacacfa89b3991aa28fd41cbf2dc06e513ebe796827e1ce\": rpc error: code = NotFound desc = could not find container \"5ecfad3b229811170eacacfa89b3991aa28fd41cbf2dc06e513ebe796827e1ce\": container with ID starting with 5ecfad3b229811170eacacfa89b3991aa28fd41cbf2dc06e513ebe796827e1ce not found: ID does not exist" Apr 24 22:31:02.004617 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:02.004583 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5568c976c5-5n4wk"] Apr 24 22:31:02.008755 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:02.008726 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5568c976c5-5n4wk"] Apr 24 22:31:02.637915 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:02.637878 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38193c2c-a2e9-44ba-9dd4-76687aaabfb3" path="/var/lib/kubelet/pods/38193c2c-a2e9-44ba-9dd4-76687aaabfb3/volumes" Apr 24 22:31:13.567529 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.567492 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-88ffs"] Apr 24 22:31:13.568130 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.568040 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38193c2c-a2e9-44ba-9dd4-76687aaabfb3" containerName="registry" Apr 24 22:31:13.568130 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.568089 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="38193c2c-a2e9-44ba-9dd4-76687aaabfb3" containerName="registry" Apr 24 22:31:13.568262 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.568161 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="38193c2c-a2e9-44ba-9dd4-76687aaabfb3" containerName="registry" Apr 24 22:31:13.598088 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.598044 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.608856 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.608823 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 22:31:13.609151 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.609133 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 22:31:13.609304 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.609290 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-8plhd\"" Apr 24 22:31:13.609977 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.609961 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 22:31:13.610025 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.609974 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 22:31:13.611167 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.611045 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 22:31:13.615493 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.615473 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 22:31:13.674488 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.674446 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cbe451bd-f49e-4fe0-8e55-929f1582a45d-root\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " 
pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.674688 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.674500 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cbe451bd-f49e-4fe0-8e55-929f1582a45d-node-exporter-textfile\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.674688 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.674533 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxrmr\" (UniqueName: \"kubernetes.io/projected/cbe451bd-f49e-4fe0-8e55-929f1582a45d-kube-api-access-qxrmr\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.674688 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.674564 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cbe451bd-f49e-4fe0-8e55-929f1582a45d-node-exporter-accelerators-collector-config\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.674688 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.674591 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cbe451bd-f49e-4fe0-8e55-929f1582a45d-node-exporter-wtmp\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.674688 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.674677 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cbe451bd-f49e-4fe0-8e55-929f1582a45d-node-exporter-tls\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.675023 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.674723 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cbe451bd-f49e-4fe0-8e55-929f1582a45d-metrics-client-ca\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.675023 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.674776 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cbe451bd-f49e-4fe0-8e55-929f1582a45d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.675023 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.674812 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cbe451bd-f49e-4fe0-8e55-929f1582a45d-sys\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.776111 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.776037 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cbe451bd-f49e-4fe0-8e55-929f1582a45d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.776298 
ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.776135 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cbe451bd-f49e-4fe0-8e55-929f1582a45d-sys\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.776298 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.776165 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cbe451bd-f49e-4fe0-8e55-929f1582a45d-root\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.776298 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.776191 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cbe451bd-f49e-4fe0-8e55-929f1582a45d-node-exporter-textfile\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.776298 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.776213 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxrmr\" (UniqueName: \"kubernetes.io/projected/cbe451bd-f49e-4fe0-8e55-929f1582a45d-kube-api-access-qxrmr\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.776298 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.776241 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cbe451bd-f49e-4fe0-8e55-929f1582a45d-node-exporter-accelerators-collector-config\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " 
pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.776298 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.776268 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cbe451bd-f49e-4fe0-8e55-929f1582a45d-node-exporter-wtmp\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.776298 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.776273 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cbe451bd-f49e-4fe0-8e55-929f1582a45d-root\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.776680 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.776276 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cbe451bd-f49e-4fe0-8e55-929f1582a45d-sys\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.776680 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.776311 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cbe451bd-f49e-4fe0-8e55-929f1582a45d-node-exporter-tls\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.776680 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.776366 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cbe451bd-f49e-4fe0-8e55-929f1582a45d-metrics-client-ca\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " 
pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.776680 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.776553 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cbe451bd-f49e-4fe0-8e55-929f1582a45d-node-exporter-textfile\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.776875 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.776690 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cbe451bd-f49e-4fe0-8e55-929f1582a45d-node-exporter-wtmp\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.777108 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.777085 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cbe451bd-f49e-4fe0-8e55-929f1582a45d-node-exporter-accelerators-collector-config\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.777338 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.777317 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cbe451bd-f49e-4fe0-8e55-929f1582a45d-metrics-client-ca\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.778779 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.778753 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/cbe451bd-f49e-4fe0-8e55-929f1582a45d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.779108 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.779050 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cbe451bd-f49e-4fe0-8e55-929f1582a45d-node-exporter-tls\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.786470 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.786437 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxrmr\" (UniqueName: \"kubernetes.io/projected/cbe451bd-f49e-4fe0-8e55-929f1582a45d-kube-api-access-qxrmr\") pod \"node-exporter-88ffs\" (UID: \"cbe451bd-f49e-4fe0-8e55-929f1582a45d\") " pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.908580 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:13.908491 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-88ffs" Apr 24 22:31:13.917765 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:31:13.917734 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbe451bd_f49e_4fe0_8e55_929f1582a45d.slice/crio-07cf0ed83a3d6ebc79dfa062ab64af18586e72115b3ca8cb2b17d5ee8b42abd0 WatchSource:0}: Error finding container 07cf0ed83a3d6ebc79dfa062ab64af18586e72115b3ca8cb2b17d5ee8b42abd0: Status 404 returned error can't find the container with id 07cf0ed83a3d6ebc79dfa062ab64af18586e72115b3ca8cb2b17d5ee8b42abd0 Apr 24 22:31:14.019177 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:14.019138 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-88ffs" event={"ID":"cbe451bd-f49e-4fe0-8e55-929f1582a45d","Type":"ContainerStarted","Data":"07cf0ed83a3d6ebc79dfa062ab64af18586e72115b3ca8cb2b17d5ee8b42abd0"} Apr 24 22:31:15.640170 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.640124 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-77497b7776-fcp8g"] Apr 24 22:31:15.660872 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.660840 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.661815 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.661790 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-77497b7776-fcp8g"] Apr 24 22:31:15.662696 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.662676 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 24 22:31:15.662813 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.662676 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-m8bgr\"" Apr 24 22:31:15.662813 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.662741 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-6a1pk23ddq7so\"" Apr 24 22:31:15.663341 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.663201 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 24 22:31:15.667743 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.663485 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 24 22:31:15.667743 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.663569 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 24 22:31:15.667914 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.667890 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 24 22:31:15.794317 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.794277 2582 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/4f291ad5-a866-4e1b-a40c-5f48012a9705-secret-thanos-querier-tls\") pod \"thanos-querier-77497b7776-fcp8g\" (UID: \"4f291ad5-a866-4e1b-a40c-5f48012a9705\") " pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.794317 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.794330 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/4f291ad5-a866-4e1b-a40c-5f48012a9705-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-77497b7776-fcp8g\" (UID: \"4f291ad5-a866-4e1b-a40c-5f48012a9705\") " pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.794579 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.794363 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4f291ad5-a866-4e1b-a40c-5f48012a9705-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-77497b7776-fcp8g\" (UID: \"4f291ad5-a866-4e1b-a40c-5f48012a9705\") " pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.794579 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.794443 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4f291ad5-a866-4e1b-a40c-5f48012a9705-secret-grpc-tls\") pod \"thanos-querier-77497b7776-fcp8g\" (UID: \"4f291ad5-a866-4e1b-a40c-5f48012a9705\") " pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.794579 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.794474 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p69k6\" (UniqueName: 
\"kubernetes.io/projected/4f291ad5-a866-4e1b-a40c-5f48012a9705-kube-api-access-p69k6\") pod \"thanos-querier-77497b7776-fcp8g\" (UID: \"4f291ad5-a866-4e1b-a40c-5f48012a9705\") " pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.794579 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.794507 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4f291ad5-a866-4e1b-a40c-5f48012a9705-metrics-client-ca\") pod \"thanos-querier-77497b7776-fcp8g\" (UID: \"4f291ad5-a866-4e1b-a40c-5f48012a9705\") " pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.794579 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.794549 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4f291ad5-a866-4e1b-a40c-5f48012a9705-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-77497b7776-fcp8g\" (UID: \"4f291ad5-a866-4e1b-a40c-5f48012a9705\") " pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.794757 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.794585 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/4f291ad5-a866-4e1b-a40c-5f48012a9705-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-77497b7776-fcp8g\" (UID: \"4f291ad5-a866-4e1b-a40c-5f48012a9705\") " pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.895359 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.895275 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/4f291ad5-a866-4e1b-a40c-5f48012a9705-secret-thanos-querier-tls\") pod 
\"thanos-querier-77497b7776-fcp8g\" (UID: \"4f291ad5-a866-4e1b-a40c-5f48012a9705\") " pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.895359 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.895320 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/4f291ad5-a866-4e1b-a40c-5f48012a9705-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-77497b7776-fcp8g\" (UID: \"4f291ad5-a866-4e1b-a40c-5f48012a9705\") " pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.895359 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.895342 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4f291ad5-a866-4e1b-a40c-5f48012a9705-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-77497b7776-fcp8g\" (UID: \"4f291ad5-a866-4e1b-a40c-5f48012a9705\") " pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.895594 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.895384 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4f291ad5-a866-4e1b-a40c-5f48012a9705-secret-grpc-tls\") pod \"thanos-querier-77497b7776-fcp8g\" (UID: \"4f291ad5-a866-4e1b-a40c-5f48012a9705\") " pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.895594 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.895562 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p69k6\" (UniqueName: \"kubernetes.io/projected/4f291ad5-a866-4e1b-a40c-5f48012a9705-kube-api-access-p69k6\") pod \"thanos-querier-77497b7776-fcp8g\" (UID: \"4f291ad5-a866-4e1b-a40c-5f48012a9705\") " pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.895694 ip-10-0-135-222 
kubenswrapper[2582]: I0424 22:31:15.895611 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4f291ad5-a866-4e1b-a40c-5f48012a9705-metrics-client-ca\") pod \"thanos-querier-77497b7776-fcp8g\" (UID: \"4f291ad5-a866-4e1b-a40c-5f48012a9705\") " pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.895694 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.895649 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4f291ad5-a866-4e1b-a40c-5f48012a9705-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-77497b7776-fcp8g\" (UID: \"4f291ad5-a866-4e1b-a40c-5f48012a9705\") " pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.895694 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.895669 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/4f291ad5-a866-4e1b-a40c-5f48012a9705-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-77497b7776-fcp8g\" (UID: \"4f291ad5-a866-4e1b-a40c-5f48012a9705\") " pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.896685 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.896657 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4f291ad5-a866-4e1b-a40c-5f48012a9705-metrics-client-ca\") pod \"thanos-querier-77497b7776-fcp8g\" (UID: \"4f291ad5-a866-4e1b-a40c-5f48012a9705\") " pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.898495 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.898472 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/4f291ad5-a866-4e1b-a40c-5f48012a9705-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-77497b7776-fcp8g\" (UID: \"4f291ad5-a866-4e1b-a40c-5f48012a9705\") " pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.898598 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.898472 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4f291ad5-a866-4e1b-a40c-5f48012a9705-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-77497b7776-fcp8g\" (UID: \"4f291ad5-a866-4e1b-a40c-5f48012a9705\") " pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.898642 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.898619 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/4f291ad5-a866-4e1b-a40c-5f48012a9705-secret-thanos-querier-tls\") pod \"thanos-querier-77497b7776-fcp8g\" (UID: \"4f291ad5-a866-4e1b-a40c-5f48012a9705\") " pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.898848 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.898825 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4f291ad5-a866-4e1b-a40c-5f48012a9705-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-77497b7776-fcp8g\" (UID: \"4f291ad5-a866-4e1b-a40c-5f48012a9705\") " pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.898881 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.898838 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/4f291ad5-a866-4e1b-a40c-5f48012a9705-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-77497b7776-fcp8g\" (UID: 
\"4f291ad5-a866-4e1b-a40c-5f48012a9705\") " pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.898881 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.898853 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4f291ad5-a866-4e1b-a40c-5f48012a9705-secret-grpc-tls\") pod \"thanos-querier-77497b7776-fcp8g\" (UID: \"4f291ad5-a866-4e1b-a40c-5f48012a9705\") " pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.904685 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.904662 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p69k6\" (UniqueName: \"kubernetes.io/projected/4f291ad5-a866-4e1b-a40c-5f48012a9705-kube-api-access-p69k6\") pod \"thanos-querier-77497b7776-fcp8g\" (UID: \"4f291ad5-a866-4e1b-a40c-5f48012a9705\") " pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:15.973389 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:15.973351 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:16.026792 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:16.026750 2582 generic.go:358] "Generic (PLEG): container finished" podID="cbe451bd-f49e-4fe0-8e55-929f1582a45d" containerID="d9b192e2ede044d0fd1c71844e951845dd35a263cd5d2fef65c89937568c08eb" exitCode=0 Apr 24 22:31:16.026914 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:16.026816 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-88ffs" event={"ID":"cbe451bd-f49e-4fe0-8e55-929f1582a45d","Type":"ContainerDied","Data":"d9b192e2ede044d0fd1c71844e951845dd35a263cd5d2fef65c89937568c08eb"} Apr 24 22:31:16.105390 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:16.105357 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-77497b7776-fcp8g"] Apr 24 22:31:16.108400 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:31:16.108375 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f291ad5_a866_4e1b_a40c_5f48012a9705.slice/crio-9e1acdcbd3ab15eebd7b6049907ff9210c1b957a0d619063613ece0652867d31 WatchSource:0}: Error finding container 9e1acdcbd3ab15eebd7b6049907ff9210c1b957a0d619063613ece0652867d31: Status 404 returned error can't find the container with id 9e1acdcbd3ab15eebd7b6049907ff9210c1b957a0d619063613ece0652867d31 Apr 24 22:31:17.031917 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:17.031874 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-88ffs" event={"ID":"cbe451bd-f49e-4fe0-8e55-929f1582a45d","Type":"ContainerStarted","Data":"52868e4640fdf0bf7ccd90036571f8b720302de0cbc8b90392972966a159f0c7"} Apr 24 22:31:17.031917 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:17.031919 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-88ffs" 
event={"ID":"cbe451bd-f49e-4fe0-8e55-929f1582a45d","Type":"ContainerStarted","Data":"3f52ca12cad3b4057574dddaeed3af3bbb06c2527a5b427878cf87cfb28abf36"} Apr 24 22:31:17.033513 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:17.033477 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" event={"ID":"4f291ad5-a866-4e1b-a40c-5f48012a9705","Type":"ContainerStarted","Data":"9e1acdcbd3ab15eebd7b6049907ff9210c1b957a0d619063613ece0652867d31"} Apr 24 22:31:17.055094 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:17.055020 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-88ffs" podStartSLOduration=2.808096778 podStartE2EDuration="4.055001858s" podCreationTimestamp="2026-04-24 22:31:13 +0000 UTC" firstStartedPulling="2026-04-24 22:31:13.919953302 +0000 UTC m=+93.863744611" lastFinishedPulling="2026-04-24 22:31:15.166858382 +0000 UTC m=+95.110649691" observedRunningTime="2026-04-24 22:31:17.053236232 +0000 UTC m=+96.997027635" watchObservedRunningTime="2026-04-24 22:31:17.055001858 +0000 UTC m=+96.998793193" Apr 24 22:31:18.039580 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:18.039545 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" event={"ID":"4f291ad5-a866-4e1b-a40c-5f48012a9705","Type":"ContainerStarted","Data":"c809c936b674825829bfa04347166048b5cab1b70753fd5862c8501fdaa18b24"} Apr 24 22:31:18.039580 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:18.039585 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" event={"ID":"4f291ad5-a866-4e1b-a40c-5f48012a9705","Type":"ContainerStarted","Data":"b176ff1410417b0a032b5404384340139fec73e1d73819e6fab469b4b575802f"} Apr 24 22:31:18.040007 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:18.039594 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" event={"ID":"4f291ad5-a866-4e1b-a40c-5f48012a9705","Type":"ContainerStarted","Data":"e38e807f578f442277f78cd73e051055af1282da68fb5a5de71fa287672c4753"} Apr 24 22:31:18.280724 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:18.280638 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-kq27h"] Apr 24 22:31:18.308149 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:18.308099 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-kq27h"] Apr 24 22:31:18.308335 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:18.308300 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kq27h" Apr 24 22:31:18.313001 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:18.312970 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-tdndz\"" Apr 24 22:31:18.313001 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:18.312971 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 24 22:31:18.415978 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:18.415942 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4a22403d-f328-4bbf-a4a4-63ec0876fc5a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-kq27h\" (UID: \"4a22403d-f328-4bbf-a4a4-63ec0876fc5a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kq27h" Apr 24 22:31:18.516775 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:18.516731 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/4a22403d-f328-4bbf-a4a4-63ec0876fc5a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-kq27h\" (UID: \"4a22403d-f328-4bbf-a4a4-63ec0876fc5a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kq27h" Apr 24 22:31:18.519829 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:18.519801 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4a22403d-f328-4bbf-a4a4-63ec0876fc5a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-kq27h\" (UID: \"4a22403d-f328-4bbf-a4a4-63ec0876fc5a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kq27h" Apr 24 22:31:18.617831 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:18.617791 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kq27h" Apr 24 22:31:18.756027 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:18.755985 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-kq27h"] Apr 24 22:31:18.830888 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:31:18.830844 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a22403d_f328_4bbf_a4a4_63ec0876fc5a.slice/crio-518cd3a6ee10c524a22aa979748319b1175e2f70dcbfba189c9a93bc85cc5889 WatchSource:0}: Error finding container 518cd3a6ee10c524a22aa979748319b1175e2f70dcbfba189c9a93bc85cc5889: Status 404 returned error can't find the container with id 518cd3a6ee10c524a22aa979748319b1175e2f70dcbfba189c9a93bc85cc5889 Apr 24 22:31:19.044184 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:19.044142 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kq27h" 
event={"ID":"4a22403d-f328-4bbf-a4a4-63ec0876fc5a","Type":"ContainerStarted","Data":"518cd3a6ee10c524a22aa979748319b1175e2f70dcbfba189c9a93bc85cc5889"} Apr 24 22:31:20.049455 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:20.049406 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" event={"ID":"4f291ad5-a866-4e1b-a40c-5f48012a9705","Type":"ContainerStarted","Data":"9eb6709cf8e164b603b39579467fc3ec11a2281554bd83f4613ff4ef09372da3"} Apr 24 22:31:20.049455 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:20.049454 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" event={"ID":"4f291ad5-a866-4e1b-a40c-5f48012a9705","Type":"ContainerStarted","Data":"992950ccbf85e65162274fb735fe93b58fe69acd4566c625def7237ecf0c2396"} Apr 24 22:31:20.049971 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:20.049466 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" event={"ID":"4f291ad5-a866-4e1b-a40c-5f48012a9705","Type":"ContainerStarted","Data":"84310730766b8029d070f136ce6193b070edb646503f7d5d7a44379c17faf2d2"} Apr 24 22:31:20.049971 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:20.049615 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:31:20.074554 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:20.074141 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" podStartSLOduration=2.092333067 podStartE2EDuration="5.07412169s" podCreationTimestamp="2026-04-24 22:31:15 +0000 UTC" firstStartedPulling="2026-04-24 22:31:16.110430034 +0000 UTC m=+96.054221345" lastFinishedPulling="2026-04-24 22:31:19.092218651 +0000 UTC m=+99.036009968" observedRunningTime="2026-04-24 22:31:20.073198792 +0000 UTC m=+100.016990122" 
watchObservedRunningTime="2026-04-24 22:31:20.07412169 +0000 UTC m=+100.017913022" Apr 24 22:31:21.053531 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:21.053483 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kq27h" event={"ID":"4a22403d-f328-4bbf-a4a4-63ec0876fc5a","Type":"ContainerStarted","Data":"cc90e81b516fa74d5986178d9d140416b08ca726deab02eb46dec048cb918c2f"} Apr 24 22:31:21.054109 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:21.053958 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kq27h" Apr 24 22:31:21.058792 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:21.058762 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kq27h" Apr 24 22:31:21.067795 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:21.067743 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kq27h" podStartSLOduration=1.406719468 podStartE2EDuration="3.067726011s" podCreationTimestamp="2026-04-24 22:31:18 +0000 UTC" firstStartedPulling="2026-04-24 22:31:18.832848165 +0000 UTC m=+98.776639474" lastFinishedPulling="2026-04-24 22:31:20.493854695 +0000 UTC m=+100.437646017" observedRunningTime="2026-04-24 22:31:21.066912725 +0000 UTC m=+101.010704053" watchObservedRunningTime="2026-04-24 22:31:21.067726011 +0000 UTC m=+101.011517342" Apr 24 22:31:26.060158 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:31:26.060130 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-77497b7776-fcp8g" Apr 24 22:33:52.681973 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:33:52.681931 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr"] Apr 24 22:33:52.685170 
ip-10-0-135-222 kubenswrapper[2582]: I0424 22:33:52.685152 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr" Apr 24 22:33:52.687132 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:33:52.687110 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 22:33:52.687272 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:33:52.687110 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 22:33:52.687432 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:33:52.687413 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wmmqt\"" Apr 24 22:33:52.692448 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:33:52.691953 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr"] Apr 24 22:33:52.756784 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:33:52.756743 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85c4bb1e-0ea2-4130-8e8d-ddc31619d97b-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr\" (UID: \"85c4bb1e-0ea2-4130-8e8d-ddc31619d97b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr" Apr 24 22:33:52.756784 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:33:52.756790 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85c4bb1e-0ea2-4130-8e8d-ddc31619d97b-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr\" (UID: \"85c4bb1e-0ea2-4130-8e8d-ddc31619d97b\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr" Apr 24 22:33:52.756996 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:33:52.756817 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl29z\" (UniqueName: \"kubernetes.io/projected/85c4bb1e-0ea2-4130-8e8d-ddc31619d97b-kube-api-access-dl29z\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr\" (UID: \"85c4bb1e-0ea2-4130-8e8d-ddc31619d97b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr" Apr 24 22:33:52.858200 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:33:52.858159 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85c4bb1e-0ea2-4130-8e8d-ddc31619d97b-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr\" (UID: \"85c4bb1e-0ea2-4130-8e8d-ddc31619d97b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr" Apr 24 22:33:52.858200 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:33:52.858200 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85c4bb1e-0ea2-4130-8e8d-ddc31619d97b-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr\" (UID: \"85c4bb1e-0ea2-4130-8e8d-ddc31619d97b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr" Apr 24 22:33:52.858445 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:33:52.858226 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dl29z\" (UniqueName: \"kubernetes.io/projected/85c4bb1e-0ea2-4130-8e8d-ddc31619d97b-kube-api-access-dl29z\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr\" (UID: \"85c4bb1e-0ea2-4130-8e8d-ddc31619d97b\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr" Apr 24 22:33:52.858569 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:33:52.858550 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85c4bb1e-0ea2-4130-8e8d-ddc31619d97b-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr\" (UID: \"85c4bb1e-0ea2-4130-8e8d-ddc31619d97b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr" Apr 24 22:33:52.858605 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:33:52.858590 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85c4bb1e-0ea2-4130-8e8d-ddc31619d97b-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr\" (UID: \"85c4bb1e-0ea2-4130-8e8d-ddc31619d97b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr" Apr 24 22:33:52.866281 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:33:52.866254 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl29z\" (UniqueName: \"kubernetes.io/projected/85c4bb1e-0ea2-4130-8e8d-ddc31619d97b-kube-api-access-dl29z\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr\" (UID: \"85c4bb1e-0ea2-4130-8e8d-ddc31619d97b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr" Apr 24 22:33:52.994740 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:33:52.994642 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr" Apr 24 22:33:53.114517 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:33:53.114475 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr"] Apr 24 22:33:53.117870 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:33:53.117827 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85c4bb1e_0ea2_4130_8e8d_ddc31619d97b.slice/crio-08975433a79c4ae1eac2b35c2b073aa3bfa26490255732e3395bb0ca14bc6c6f WatchSource:0}: Error finding container 08975433a79c4ae1eac2b35c2b073aa3bfa26490255732e3395bb0ca14bc6c6f: Status 404 returned error can't find the container with id 08975433a79c4ae1eac2b35c2b073aa3bfa26490255732e3395bb0ca14bc6c6f Apr 24 22:33:53.466538 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:33:53.466497 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr" event={"ID":"85c4bb1e-0ea2-4130-8e8d-ddc31619d97b","Type":"ContainerStarted","Data":"08975433a79c4ae1eac2b35c2b073aa3bfa26490255732e3395bb0ca14bc6c6f"} Apr 24 22:34:00.486938 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:00.486901 2582 generic.go:358] "Generic (PLEG): container finished" podID="85c4bb1e-0ea2-4130-8e8d-ddc31619d97b" containerID="c5ae254cdd3333fc4af3e9399bc80ed7b4b701998b5718690d38238c93eeda3e" exitCode=0 Apr 24 22:34:00.487339 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:00.486971 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr" event={"ID":"85c4bb1e-0ea2-4130-8e8d-ddc31619d97b","Type":"ContainerDied","Data":"c5ae254cdd3333fc4af3e9399bc80ed7b4b701998b5718690d38238c93eeda3e"} Apr 24 22:34:03.497051 ip-10-0-135-222 kubenswrapper[2582]: 
I0424 22:34:03.497013 2582 generic.go:358] "Generic (PLEG): container finished" podID="85c4bb1e-0ea2-4130-8e8d-ddc31619d97b" containerID="9b64fa88ff58ffe432b28c648ec155988bafcb99179c92f43f48ca5efdfe5bad" exitCode=0 Apr 24 22:34:03.497438 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:03.497092 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr" event={"ID":"85c4bb1e-0ea2-4130-8e8d-ddc31619d97b","Type":"ContainerDied","Data":"9b64fa88ff58ffe432b28c648ec155988bafcb99179c92f43f48ca5efdfe5bad"} Apr 24 22:34:10.519232 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:10.519193 2582 generic.go:358] "Generic (PLEG): container finished" podID="85c4bb1e-0ea2-4130-8e8d-ddc31619d97b" containerID="b46a279c6c27dc30f071d512175e862926943f9cf896c123b114421b1b566bc6" exitCode=0 Apr 24 22:34:10.519598 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:10.519265 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr" event={"ID":"85c4bb1e-0ea2-4130-8e8d-ddc31619d97b","Type":"ContainerDied","Data":"b46a279c6c27dc30f071d512175e862926943f9cf896c123b114421b1b566bc6"} Apr 24 22:34:11.651288 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:11.651263 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr"
Apr 24 22:34:11.813486 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:11.813434 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85c4bb1e-0ea2-4130-8e8d-ddc31619d97b-bundle\") pod \"85c4bb1e-0ea2-4130-8e8d-ddc31619d97b\" (UID: \"85c4bb1e-0ea2-4130-8e8d-ddc31619d97b\") "
Apr 24 22:34:11.813623 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:11.813511 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl29z\" (UniqueName: \"kubernetes.io/projected/85c4bb1e-0ea2-4130-8e8d-ddc31619d97b-kube-api-access-dl29z\") pod \"85c4bb1e-0ea2-4130-8e8d-ddc31619d97b\" (UID: \"85c4bb1e-0ea2-4130-8e8d-ddc31619d97b\") "
Apr 24 22:34:11.813623 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:11.813544 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85c4bb1e-0ea2-4130-8e8d-ddc31619d97b-util\") pod \"85c4bb1e-0ea2-4130-8e8d-ddc31619d97b\" (UID: \"85c4bb1e-0ea2-4130-8e8d-ddc31619d97b\") "
Apr 24 22:34:11.814027 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:11.813991 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85c4bb1e-0ea2-4130-8e8d-ddc31619d97b-bundle" (OuterVolumeSpecName: "bundle") pod "85c4bb1e-0ea2-4130-8e8d-ddc31619d97b" (UID: "85c4bb1e-0ea2-4130-8e8d-ddc31619d97b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:34:11.815891 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:11.815856 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85c4bb1e-0ea2-4130-8e8d-ddc31619d97b-kube-api-access-dl29z" (OuterVolumeSpecName: "kube-api-access-dl29z") pod "85c4bb1e-0ea2-4130-8e8d-ddc31619d97b" (UID: "85c4bb1e-0ea2-4130-8e8d-ddc31619d97b"). InnerVolumeSpecName "kube-api-access-dl29z". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:34:11.817566 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:11.817546 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85c4bb1e-0ea2-4130-8e8d-ddc31619d97b-util" (OuterVolumeSpecName: "util") pod "85c4bb1e-0ea2-4130-8e8d-ddc31619d97b" (UID: "85c4bb1e-0ea2-4130-8e8d-ddc31619d97b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:34:11.915000 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:11.914962 2582 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85c4bb1e-0ea2-4130-8e8d-ddc31619d97b-util\") on node \"ip-10-0-135-222.ec2.internal\" DevicePath \"\""
Apr 24 22:34:11.915000 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:11.914993 2582 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85c4bb1e-0ea2-4130-8e8d-ddc31619d97b-bundle\") on node \"ip-10-0-135-222.ec2.internal\" DevicePath \"\""
Apr 24 22:34:11.915000 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:11.915004 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dl29z\" (UniqueName: \"kubernetes.io/projected/85c4bb1e-0ea2-4130-8e8d-ddc31619d97b-kube-api-access-dl29z\") on node \"ip-10-0-135-222.ec2.internal\" DevicePath \"\""
Apr 24 22:34:12.526546 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:12.526509 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr" event={"ID":"85c4bb1e-0ea2-4130-8e8d-ddc31619d97b","Type":"ContainerDied","Data":"08975433a79c4ae1eac2b35c2b073aa3bfa26490255732e3395bb0ca14bc6c6f"}
Apr 24 22:34:12.526546 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:12.526548 2582 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08975433a79c4ae1eac2b35c2b073aa3bfa26490255732e3395bb0ca14bc6c6f"
Apr 24 22:34:12.526742 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:12.526552 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cswckr"
Apr 24 22:34:14.323102 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:14.323049 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4l9fq"]
Apr 24 22:34:14.323590 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:14.323381 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85c4bb1e-0ea2-4130-8e8d-ddc31619d97b" containerName="util"
Apr 24 22:34:14.323590 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:14.323393 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="85c4bb1e-0ea2-4130-8e8d-ddc31619d97b" containerName="util"
Apr 24 22:34:14.323590 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:14.323400 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85c4bb1e-0ea2-4130-8e8d-ddc31619d97b" containerName="pull"
Apr 24 22:34:14.323590 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:14.323406 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="85c4bb1e-0ea2-4130-8e8d-ddc31619d97b" containerName="pull"
Apr 24 22:34:14.323590 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:14.323414 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85c4bb1e-0ea2-4130-8e8d-ddc31619d97b" containerName="extract"
Apr 24 22:34:14.323590 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:14.323420 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="85c4bb1e-0ea2-4130-8e8d-ddc31619d97b" containerName="extract"
Apr 24 22:34:14.323590 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:14.323477 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="85c4bb1e-0ea2-4130-8e8d-ddc31619d97b" containerName="extract"
Apr 24 22:34:14.325444 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:14.325427 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4l9fq"
Apr 24 22:34:14.327287 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:14.327261 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 24 22:34:14.327444 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:14.327329 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 24 22:34:14.327444 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:14.327425 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 24 22:34:14.327864 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:14.327846 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-v9jwh\""
Apr 24 22:34:14.339254 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:14.339209 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4l9fq"]
Apr 24 22:34:14.431730 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:14.431684 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdkf6\" (UniqueName: \"kubernetes.io/projected/ffc19f70-8782-4204-a308-d373ceb0919b-kube-api-access-hdkf6\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4l9fq\" (UID: \"ffc19f70-8782-4204-a308-d373ceb0919b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4l9fq"
Apr 24 22:34:14.431908 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:14.431806 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/ffc19f70-8782-4204-a308-d373ceb0919b-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4l9fq\" (UID: \"ffc19f70-8782-4204-a308-d373ceb0919b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4l9fq"
Apr 24 22:34:14.532379 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:14.532345 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/ffc19f70-8782-4204-a308-d373ceb0919b-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4l9fq\" (UID: \"ffc19f70-8782-4204-a308-d373ceb0919b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4l9fq"
Apr 24 22:34:14.532545 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:14.532394 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hdkf6\" (UniqueName: \"kubernetes.io/projected/ffc19f70-8782-4204-a308-d373ceb0919b-kube-api-access-hdkf6\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4l9fq\" (UID: \"ffc19f70-8782-4204-a308-d373ceb0919b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4l9fq"
Apr 24 22:34:14.534892 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:14.534858 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/ffc19f70-8782-4204-a308-d373ceb0919b-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4l9fq\" (UID: \"ffc19f70-8782-4204-a308-d373ceb0919b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4l9fq"
Apr 24 22:34:14.540409 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:14.540382 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdkf6\" (UniqueName: \"kubernetes.io/projected/ffc19f70-8782-4204-a308-d373ceb0919b-kube-api-access-hdkf6\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4l9fq\" (UID: \"ffc19f70-8782-4204-a308-d373ceb0919b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4l9fq"
Apr 24 22:34:14.634871 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:14.634785 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4l9fq"
Apr 24 22:34:14.758752 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:14.758703 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4l9fq"]
Apr 24 22:34:14.762711 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:34:14.762682 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffc19f70_8782_4204_a308_d373ceb0919b.slice/crio-a27236a889fdcf32167349582b70acadb3cbdc5153f3624ed306641417c04415 WatchSource:0}: Error finding container a27236a889fdcf32167349582b70acadb3cbdc5153f3624ed306641417c04415: Status 404 returned error can't find the container with id a27236a889fdcf32167349582b70acadb3cbdc5153f3624ed306641417c04415
Apr 24 22:34:15.535901 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:15.535861 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4l9fq" event={"ID":"ffc19f70-8782-4204-a308-d373ceb0919b","Type":"ContainerStarted","Data":"a27236a889fdcf32167349582b70acadb3cbdc5153f3624ed306641417c04415"}
Apr 24 22:34:18.363322 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.363282 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-pm7hg"]
Apr 24 22:34:18.365540 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.365512 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-pm7hg"
Apr 24 22:34:18.367593 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.367573 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 24 22:34:18.367781 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.367766 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 24 22:34:18.367999 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.367982 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-k2k4q\""
Apr 24 22:34:18.374587 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.374563 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-pm7hg"]
Apr 24 22:34:18.546514 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.546467 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4l9fq" event={"ID":"ffc19f70-8782-4204-a308-d373ceb0919b","Type":"ContainerStarted","Data":"025a4b0b6f5ba0ddcdd155297c62b633bf6ab94e412e4e6038283519b5460577"}
Apr 24 22:34:18.546717 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.546687 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4l9fq"
Apr 24 22:34:18.563126 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.563047 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4l9fq" podStartSLOduration=1.519603219 podStartE2EDuration="4.563029732s" podCreationTimestamp="2026-04-24 22:34:14 +0000 UTC" firstStartedPulling="2026-04-24 22:34:14.764508871 +0000 UTC m=+274.708300180" lastFinishedPulling="2026-04-24 22:34:17.807935381 +0000 UTC m=+277.751726693" observedRunningTime="2026-04-24 22:34:18.561357267 +0000 UTC m=+278.505148598" watchObservedRunningTime="2026-04-24 22:34:18.563029732 +0000 UTC m=+278.506821062"
Apr 24 22:34:18.563670 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.563650 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h84jd\" (UniqueName: \"kubernetes.io/projected/c0353983-6f27-46a6-b73a-55346dbab7a8-kube-api-access-h84jd\") pod \"keda-operator-ffbb595cb-pm7hg\" (UID: \"c0353983-6f27-46a6-b73a-55346dbab7a8\") " pod="openshift-keda/keda-operator-ffbb595cb-pm7hg"
Apr 24 22:34:18.563717 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.563681 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/c0353983-6f27-46a6-b73a-55346dbab7a8-cabundle0\") pod \"keda-operator-ffbb595cb-pm7hg\" (UID: \"c0353983-6f27-46a6-b73a-55346dbab7a8\") " pod="openshift-keda/keda-operator-ffbb595cb-pm7hg"
Apr 24 22:34:18.563717 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.563714 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c0353983-6f27-46a6-b73a-55346dbab7a8-certificates\") pod \"keda-operator-ffbb595cb-pm7hg\" (UID: \"c0353983-6f27-46a6-b73a-55346dbab7a8\") " pod="openshift-keda/keda-operator-ffbb595cb-pm7hg"
Apr 24 22:34:18.664569 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.664470 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h84jd\" (UniqueName: \"kubernetes.io/projected/c0353983-6f27-46a6-b73a-55346dbab7a8-kube-api-access-h84jd\") pod \"keda-operator-ffbb595cb-pm7hg\" (UID: \"c0353983-6f27-46a6-b73a-55346dbab7a8\") " pod="openshift-keda/keda-operator-ffbb595cb-pm7hg"
Apr 24 22:34:18.664569 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.664512 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/c0353983-6f27-46a6-b73a-55346dbab7a8-cabundle0\") pod \"keda-operator-ffbb595cb-pm7hg\" (UID: \"c0353983-6f27-46a6-b73a-55346dbab7a8\") " pod="openshift-keda/keda-operator-ffbb595cb-pm7hg"
Apr 24 22:34:18.664782 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.664681 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c0353983-6f27-46a6-b73a-55346dbab7a8-certificates\") pod \"keda-operator-ffbb595cb-pm7hg\" (UID: \"c0353983-6f27-46a6-b73a-55346dbab7a8\") " pod="openshift-keda/keda-operator-ffbb595cb-pm7hg"
Apr 24 22:34:18.664837 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:18.664797 2582 secret.go:281] references non-existent secret key: ca.crt
Apr 24 22:34:18.664837 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:18.664821 2582 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 22:34:18.664837 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:18.664834 2582 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-pm7hg: references non-existent secret key: ca.crt
Apr 24 22:34:18.664989 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:18.664900 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0353983-6f27-46a6-b73a-55346dbab7a8-certificates podName:c0353983-6f27-46a6-b73a-55346dbab7a8 nodeName:}" failed. No retries permitted until 2026-04-24 22:34:19.164878983 +0000 UTC m=+279.108670295 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c0353983-6f27-46a6-b73a-55346dbab7a8-certificates") pod "keda-operator-ffbb595cb-pm7hg" (UID: "c0353983-6f27-46a6-b73a-55346dbab7a8") : references non-existent secret key: ca.crt
Apr 24 22:34:18.665208 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.665185 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/c0353983-6f27-46a6-b73a-55346dbab7a8-cabundle0\") pod \"keda-operator-ffbb595cb-pm7hg\" (UID: \"c0353983-6f27-46a6-b73a-55346dbab7a8\") " pod="openshift-keda/keda-operator-ffbb595cb-pm7hg"
Apr 24 22:34:18.672851 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.672823 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h84jd\" (UniqueName: \"kubernetes.io/projected/c0353983-6f27-46a6-b73a-55346dbab7a8-kube-api-access-h84jd\") pod \"keda-operator-ffbb595cb-pm7hg\" (UID: \"c0353983-6f27-46a6-b73a-55346dbab7a8\") " pod="openshift-keda/keda-operator-ffbb595cb-pm7hg"
Apr 24 22:34:18.769242 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.769203 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft"]
Apr 24 22:34:18.771402 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.771375 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft"
Apr 24 22:34:18.773098 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.773051 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 24 22:34:18.779839 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.779810 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft"]
Apr 24 22:34:18.867166 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.867123 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/587e5c8a-4963-48b8-b114-729b06ff1899-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-hhxft\" (UID: \"587e5c8a-4963-48b8-b114-729b06ff1899\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft"
Apr 24 22:34:18.867348 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.867206 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/587e5c8a-4963-48b8-b114-729b06ff1899-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hhxft\" (UID: \"587e5c8a-4963-48b8-b114-729b06ff1899\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft"
Apr 24 22:34:18.867348 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.867227 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfszv\" (UniqueName: \"kubernetes.io/projected/587e5c8a-4963-48b8-b114-729b06ff1899-kube-api-access-rfszv\") pod \"keda-metrics-apiserver-7c9f485588-hhxft\" (UID: \"587e5c8a-4963-48b8-b114-729b06ff1899\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft"
Apr 24 22:34:18.967813 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.967728 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/587e5c8a-4963-48b8-b114-729b06ff1899-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-hhxft\" (UID: \"587e5c8a-4963-48b8-b114-729b06ff1899\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft"
Apr 24 22:34:18.967960 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.967824 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/587e5c8a-4963-48b8-b114-729b06ff1899-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hhxft\" (UID: \"587e5c8a-4963-48b8-b114-729b06ff1899\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft"
Apr 24 22:34:18.967960 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.967874 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rfszv\" (UniqueName: \"kubernetes.io/projected/587e5c8a-4963-48b8-b114-729b06ff1899-kube-api-access-rfszv\") pod \"keda-metrics-apiserver-7c9f485588-hhxft\" (UID: \"587e5c8a-4963-48b8-b114-729b06ff1899\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft"
Apr 24 22:34:18.968083 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:18.967992 2582 secret.go:281] references non-existent secret key: tls.crt
Apr 24 22:34:18.968083 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:18.968014 2582 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 22:34:18.968083 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:18.968038 2582 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft: references non-existent secret key: tls.crt
Apr 24 22:34:18.968208 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:18.968130 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/587e5c8a-4963-48b8-b114-729b06ff1899-certificates podName:587e5c8a-4963-48b8-b114-729b06ff1899 nodeName:}" failed. No retries permitted until 2026-04-24 22:34:19.468107192 +0000 UTC m=+279.411898519 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/587e5c8a-4963-48b8-b114-729b06ff1899-certificates") pod "keda-metrics-apiserver-7c9f485588-hhxft" (UID: "587e5c8a-4963-48b8-b114-729b06ff1899") : references non-existent secret key: tls.crt
Apr 24 22:34:18.968208 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.968151 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/587e5c8a-4963-48b8-b114-729b06ff1899-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-hhxft\" (UID: \"587e5c8a-4963-48b8-b114-729b06ff1899\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft"
Apr 24 22:34:18.995581 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:18.995544 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfszv\" (UniqueName: \"kubernetes.io/projected/587e5c8a-4963-48b8-b114-729b06ff1899-kube-api-access-rfszv\") pod \"keda-metrics-apiserver-7c9f485588-hhxft\" (UID: \"587e5c8a-4963-48b8-b114-729b06ff1899\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft"
Apr 24 22:34:19.041458 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:19.041416 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-nzqqw"]
Apr 24 22:34:19.044169 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:19.044133 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-nzqqw"
Apr 24 22:34:19.046147 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:19.046119 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 24 22:34:19.053941 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:19.053913 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-nzqqw"]
Apr 24 22:34:19.068325 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:19.068286 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1c60981c-4310-4d4c-b7d0-b76bedf3f8bd-certificates\") pod \"keda-admission-cf49989db-nzqqw\" (UID: \"1c60981c-4310-4d4c-b7d0-b76bedf3f8bd\") " pod="openshift-keda/keda-admission-cf49989db-nzqqw"
Apr 24 22:34:19.068499 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:19.068370 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5jhk\" (UniqueName: \"kubernetes.io/projected/1c60981c-4310-4d4c-b7d0-b76bedf3f8bd-kube-api-access-w5jhk\") pod \"keda-admission-cf49989db-nzqqw\" (UID: \"1c60981c-4310-4d4c-b7d0-b76bedf3f8bd\") " pod="openshift-keda/keda-admission-cf49989db-nzqqw"
Apr 24 22:34:19.169658 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:19.169613 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1c60981c-4310-4d4c-b7d0-b76bedf3f8bd-certificates\") pod \"keda-admission-cf49989db-nzqqw\" (UID: \"1c60981c-4310-4d4c-b7d0-b76bedf3f8bd\") " pod="openshift-keda/keda-admission-cf49989db-nzqqw"
Apr 24 22:34:19.169857 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:19.169672 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c0353983-6f27-46a6-b73a-55346dbab7a8-certificates\") pod \"keda-operator-ffbb595cb-pm7hg\" (UID: \"c0353983-6f27-46a6-b73a-55346dbab7a8\") " pod="openshift-keda/keda-operator-ffbb595cb-pm7hg"
Apr 24 22:34:19.169857 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:19.169785 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5jhk\" (UniqueName: \"kubernetes.io/projected/1c60981c-4310-4d4c-b7d0-b76bedf3f8bd-kube-api-access-w5jhk\") pod \"keda-admission-cf49989db-nzqqw\" (UID: \"1c60981c-4310-4d4c-b7d0-b76bedf3f8bd\") " pod="openshift-keda/keda-admission-cf49989db-nzqqw"
Apr 24 22:34:19.170263 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:19.170224 2582 secret.go:281] references non-existent secret key: ca.crt
Apr 24 22:34:19.170263 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:19.170262 2582 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 22:34:19.170263 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:19.170277 2582 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-pm7hg: references non-existent secret key: ca.crt
Apr 24 22:34:19.170504 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:19.170352 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0353983-6f27-46a6-b73a-55346dbab7a8-certificates podName:c0353983-6f27-46a6-b73a-55346dbab7a8 nodeName:}" failed. No retries permitted until 2026-04-24 22:34:20.170330458 +0000 UTC m=+280.114121772 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c0353983-6f27-46a6-b73a-55346dbab7a8-certificates") pod "keda-operator-ffbb595cb-pm7hg" (UID: "c0353983-6f27-46a6-b73a-55346dbab7a8") : references non-existent secret key: ca.crt
Apr 24 22:34:19.175009 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:19.174237 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1c60981c-4310-4d4c-b7d0-b76bedf3f8bd-certificates\") pod \"keda-admission-cf49989db-nzqqw\" (UID: \"1c60981c-4310-4d4c-b7d0-b76bedf3f8bd\") " pod="openshift-keda/keda-admission-cf49989db-nzqqw"
Apr 24 22:34:19.178374 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:19.178347 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5jhk\" (UniqueName: \"kubernetes.io/projected/1c60981c-4310-4d4c-b7d0-b76bedf3f8bd-kube-api-access-w5jhk\") pod \"keda-admission-cf49989db-nzqqw\" (UID: \"1c60981c-4310-4d4c-b7d0-b76bedf3f8bd\") " pod="openshift-keda/keda-admission-cf49989db-nzqqw"
Apr 24 22:34:19.356710 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:19.356677 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-nzqqw"
Apr 24 22:34:19.472600 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:19.472566 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/587e5c8a-4963-48b8-b114-729b06ff1899-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hhxft\" (UID: \"587e5c8a-4963-48b8-b114-729b06ff1899\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft"
Apr 24 22:34:19.472933 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:19.472718 2582 secret.go:281] references non-existent secret key: tls.crt
Apr 24 22:34:19.472933 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:19.472735 2582 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 22:34:19.472933 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:19.472755 2582 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft: references non-existent secret key: tls.crt
Apr 24 22:34:19.472933 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:19.472816 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/587e5c8a-4963-48b8-b114-729b06ff1899-certificates podName:587e5c8a-4963-48b8-b114-729b06ff1899 nodeName:}" failed. No retries permitted until 2026-04-24 22:34:20.4728006 +0000 UTC m=+280.416591913 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/587e5c8a-4963-48b8-b114-729b06ff1899-certificates") pod "keda-metrics-apiserver-7c9f485588-hhxft" (UID: "587e5c8a-4963-48b8-b114-729b06ff1899") : references non-existent secret key: tls.crt
Apr 24 22:34:19.497381 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:19.497344 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-nzqqw"]
Apr 24 22:34:19.501480 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:34:19.501448 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c60981c_4310_4d4c_b7d0_b76bedf3f8bd.slice/crio-101f8ec5cb389df83c16f9e62c450137af9a6af370100235333c41c2f43c2612 WatchSource:0}: Error finding container 101f8ec5cb389df83c16f9e62c450137af9a6af370100235333c41c2f43c2612: Status 404 returned error can't find the container with id 101f8ec5cb389df83c16f9e62c450137af9a6af370100235333c41c2f43c2612
Apr 24 22:34:19.550526 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:19.550484 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-nzqqw" event={"ID":"1c60981c-4310-4d4c-b7d0-b76bedf3f8bd","Type":"ContainerStarted","Data":"101f8ec5cb389df83c16f9e62c450137af9a6af370100235333c41c2f43c2612"}
Apr 24 22:34:20.178899 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:20.178862 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c0353983-6f27-46a6-b73a-55346dbab7a8-certificates\") pod \"keda-operator-ffbb595cb-pm7hg\" (UID: \"c0353983-6f27-46a6-b73a-55346dbab7a8\") " pod="openshift-keda/keda-operator-ffbb595cb-pm7hg"
Apr 24 22:34:20.179090 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:20.179028 2582 secret.go:281] references non-existent secret key: ca.crt
Apr 24 22:34:20.179090 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:20.179048 2582 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 22:34:20.179090 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:20.179075 2582 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-pm7hg: references non-existent secret key: ca.crt
Apr 24 22:34:20.179239 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:20.179144 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0353983-6f27-46a6-b73a-55346dbab7a8-certificates podName:c0353983-6f27-46a6-b73a-55346dbab7a8 nodeName:}" failed. No retries permitted until 2026-04-24 22:34:22.179115278 +0000 UTC m=+282.122906594 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c0353983-6f27-46a6-b73a-55346dbab7a8-certificates") pod "keda-operator-ffbb595cb-pm7hg" (UID: "c0353983-6f27-46a6-b73a-55346dbab7a8") : references non-existent secret key: ca.crt
Apr 24 22:34:20.481943 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:20.481851 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/587e5c8a-4963-48b8-b114-729b06ff1899-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hhxft\" (UID: \"587e5c8a-4963-48b8-b114-729b06ff1899\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft"
Apr 24 22:34:20.482390 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:20.481994 2582 secret.go:281] references non-existent secret key: tls.crt
Apr 24 22:34:20.482390 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:20.482009 2582 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 22:34:20.482390 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:20.482026 2582 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft: references non-existent secret key: tls.crt
Apr 24 22:34:20.482390 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:20.482103 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/587e5c8a-4963-48b8-b114-729b06ff1899-certificates podName:587e5c8a-4963-48b8-b114-729b06ff1899 nodeName:}" failed. No retries permitted until 2026-04-24 22:34:22.482082771 +0000 UTC m=+282.425874081 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/587e5c8a-4963-48b8-b114-729b06ff1899-certificates") pod "keda-metrics-apiserver-7c9f485588-hhxft" (UID: "587e5c8a-4963-48b8-b114-729b06ff1899") : references non-existent secret key: tls.crt
Apr 24 22:34:21.558169 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:21.558129 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-nzqqw" event={"ID":"1c60981c-4310-4d4c-b7d0-b76bedf3f8bd","Type":"ContainerStarted","Data":"79a39cb80d0a0ee55f8ff3a3d75df4ee658a49822d26d9e09562bb9450d5856e"}
Apr 24 22:34:21.558545 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:21.558186 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-nzqqw"
Apr 24 22:34:21.573196 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:21.573134 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-nzqqw" podStartSLOduration=1.364857412 podStartE2EDuration="2.573115389s" podCreationTimestamp="2026-04-24 22:34:19 +0000 UTC" firstStartedPulling="2026-04-24 22:34:19.502746802 +0000 UTC m=+279.446538110" lastFinishedPulling="2026-04-24 22:34:20.71100471 +0000 UTC m=+280.654796087" observedRunningTime="2026-04-24 22:34:21.571757467 +0000 UTC m=+281.515548798" watchObservedRunningTime="2026-04-24 22:34:21.573115389 +0000 UTC m=+281.516906721"
Apr 24 22:34:22.193806 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:22.193751 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c0353983-6f27-46a6-b73a-55346dbab7a8-certificates\") pod \"keda-operator-ffbb595cb-pm7hg\" (UID: \"c0353983-6f27-46a6-b73a-55346dbab7a8\") " pod="openshift-keda/keda-operator-ffbb595cb-pm7hg"
Apr 24 22:34:22.193977 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:22.193902 2582 secret.go:281] references non-existent secret key: ca.crt
Apr 24 22:34:22.193977 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:22.193922 2582 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 22:34:22.193977 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:22.193931 2582 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-pm7hg: references non-existent secret key: ca.crt
Apr 24 22:34:22.194140 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:22.193985 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0353983-6f27-46a6-b73a-55346dbab7a8-certificates podName:c0353983-6f27-46a6-b73a-55346dbab7a8 nodeName:}" failed. No retries permitted until 2026-04-24 22:34:26.193971434 +0000 UTC m=+286.137762744 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c0353983-6f27-46a6-b73a-55346dbab7a8-certificates") pod "keda-operator-ffbb595cb-pm7hg" (UID: "c0353983-6f27-46a6-b73a-55346dbab7a8") : references non-existent secret key: ca.crt
Apr 24 22:34:22.496128 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:22.495996 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/587e5c8a-4963-48b8-b114-729b06ff1899-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hhxft\" (UID: \"587e5c8a-4963-48b8-b114-729b06ff1899\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft"
Apr 24 22:34:22.496286 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:22.496158 2582 secret.go:281] references non-existent secret key: tls.crt
Apr 24 22:34:22.496286 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:22.496182 2582 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 22:34:22.496286 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:22.496203 2582 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft: references non-existent secret key: tls.crt
Apr 24 22:34:22.496286 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:34:22.496257 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/587e5c8a-4963-48b8-b114-729b06ff1899-certificates podName:587e5c8a-4963-48b8-b114-729b06ff1899 nodeName:}" failed. No retries permitted until 2026-04-24 22:34:26.496243233 +0000 UTC m=+286.440034542 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/587e5c8a-4963-48b8-b114-729b06ff1899-certificates") pod "keda-metrics-apiserver-7c9f485588-hhxft" (UID: "587e5c8a-4963-48b8-b114-729b06ff1899") : references non-existent secret key: tls.crt Apr 24 22:34:26.228687 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:26.228638 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c0353983-6f27-46a6-b73a-55346dbab7a8-certificates\") pod \"keda-operator-ffbb595cb-pm7hg\" (UID: \"c0353983-6f27-46a6-b73a-55346dbab7a8\") " pod="openshift-keda/keda-operator-ffbb595cb-pm7hg" Apr 24 22:34:26.231526 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:26.231497 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c0353983-6f27-46a6-b73a-55346dbab7a8-certificates\") pod \"keda-operator-ffbb595cb-pm7hg\" (UID: \"c0353983-6f27-46a6-b73a-55346dbab7a8\") " pod="openshift-keda/keda-operator-ffbb595cb-pm7hg" Apr 24 22:34:26.476524 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:26.476489 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-pm7hg" Apr 24 22:34:26.531620 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:26.531585 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/587e5c8a-4963-48b8-b114-729b06ff1899-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hhxft\" (UID: \"587e5c8a-4963-48b8-b114-729b06ff1899\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft" Apr 24 22:34:26.534563 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:26.534523 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/587e5c8a-4963-48b8-b114-729b06ff1899-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hhxft\" (UID: \"587e5c8a-4963-48b8-b114-729b06ff1899\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft" Apr 24 22:34:26.584259 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:26.584230 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft" Apr 24 22:34:26.599990 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:26.599963 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-pm7hg"] Apr 24 22:34:26.602258 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:34:26.602230 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0353983_6f27_46a6_b73a_55346dbab7a8.slice/crio-583cf8392b92adb9a8bbc322a685369c1e826d294f968b7e927645707c2da14e WatchSource:0}: Error finding container 583cf8392b92adb9a8bbc322a685369c1e826d294f968b7e927645707c2da14e: Status 404 returned error can't find the container with id 583cf8392b92adb9a8bbc322a685369c1e826d294f968b7e927645707c2da14e Apr 24 22:34:26.712001 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:26.711968 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft"] Apr 24 22:34:26.715187 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:34:26.715150 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod587e5c8a_4963_48b8_b114_729b06ff1899.slice/crio-cda42c708c855da46599c12f17154abea0a8cd9ee830020bb7c6663d36b78255 WatchSource:0}: Error finding container cda42c708c855da46599c12f17154abea0a8cd9ee830020bb7c6663d36b78255: Status 404 returned error can't find the container with id cda42c708c855da46599c12f17154abea0a8cd9ee830020bb7c6663d36b78255 Apr 24 22:34:27.577708 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:27.577658 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-pm7hg" event={"ID":"c0353983-6f27-46a6-b73a-55346dbab7a8","Type":"ContainerStarted","Data":"583cf8392b92adb9a8bbc322a685369c1e826d294f968b7e927645707c2da14e"} Apr 24 22:34:27.579556 ip-10-0-135-222 kubenswrapper[2582]: I0424 
22:34:27.579514 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft" event={"ID":"587e5c8a-4963-48b8-b114-729b06ff1899","Type":"ContainerStarted","Data":"cda42c708c855da46599c12f17154abea0a8cd9ee830020bb7c6663d36b78255"} Apr 24 22:34:30.589684 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:30.589642 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-pm7hg" event={"ID":"c0353983-6f27-46a6-b73a-55346dbab7a8","Type":"ContainerStarted","Data":"09df2254b0208ef95486e071d7ec9ed690dd0269114d9f828466f19755e2a4ad"} Apr 24 22:34:30.590170 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:30.589732 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-pm7hg" Apr 24 22:34:30.591093 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:30.591046 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft" event={"ID":"587e5c8a-4963-48b8-b114-729b06ff1899","Type":"ContainerStarted","Data":"f9228cbcd69642491f62c606465fd951375a9944889039d605f26e982bbfcd7a"} Apr 24 22:34:30.591207 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:30.591169 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft" Apr 24 22:34:30.604838 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:30.604761 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-pm7hg" podStartSLOduration=8.966244789 podStartE2EDuration="12.6047415s" podCreationTimestamp="2026-04-24 22:34:18 +0000 UTC" firstStartedPulling="2026-04-24 22:34:26.603361672 +0000 UTC m=+286.547152980" lastFinishedPulling="2026-04-24 22:34:30.241858381 +0000 UTC m=+290.185649691" observedRunningTime="2026-04-24 22:34:30.60452647 +0000 UTC m=+290.548317802" 
watchObservedRunningTime="2026-04-24 22:34:30.6047415 +0000 UTC m=+290.548532832" Apr 24 22:34:30.620559 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:30.620504 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft" podStartSLOduration=9.101793967 podStartE2EDuration="12.620488542s" podCreationTimestamp="2026-04-24 22:34:18 +0000 UTC" firstStartedPulling="2026-04-24 22:34:26.716487062 +0000 UTC m=+286.660278374" lastFinishedPulling="2026-04-24 22:34:30.235181626 +0000 UTC m=+290.178972949" observedRunningTime="2026-04-24 22:34:30.61899222 +0000 UTC m=+290.562783545" watchObservedRunningTime="2026-04-24 22:34:30.620488542 +0000 UTC m=+290.564279874" Apr 24 22:34:39.552906 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:39.552872 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4l9fq" Apr 24 22:34:40.529044 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:40.529018 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzsgr_239c26d8-bd64-4f99-9455-4fceceb609ee/ovn-acl-logging/0.log" Apr 24 22:34:40.529257 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:40.529018 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzsgr_239c26d8-bd64-4f99-9455-4fceceb609ee/ovn-acl-logging/0.log" Apr 24 22:34:40.532214 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:40.532192 2582 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 22:34:41.598691 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:41.598665 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhxft" Apr 24 22:34:42.563249 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:42.563211 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-keda/keda-admission-cf49989db-nzqqw" Apr 24 22:34:51.596941 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:34:51.596911 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-pm7hg" Apr 24 22:35:27.455704 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.455666 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-pfjr5"] Apr 24 22:35:27.463686 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.463662 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-549bc44c6d-pfjr5" Apr 24 22:35:27.465714 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.465688 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 22:35:27.465864 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.465688 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 24 22:35:27.465864 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.465690 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 22:35:27.466046 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.466027 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-ppjsl\"" Apr 24 22:35:27.470951 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.469600 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-ttmtv"] Apr 24 22:35:27.474425 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.474397 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-pfjr5"] Apr 24 22:35:27.474549 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.474503 2582 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ttmtv" Apr 24 22:35:27.476736 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.476712 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 24 22:35:27.476904 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.476879 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-v8wmd\"" Apr 24 22:35:27.485262 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.485240 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-ttmtv"] Apr 24 22:35:27.492837 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.492808 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-m2mvp"] Apr 24 22:35:27.495833 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.495811 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-m2mvp" Apr 24 22:35:27.497831 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.497808 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 22:35:27.497982 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.497962 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-vqhfc\"" Apr 24 22:35:27.506570 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.506527 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sqlt\" (UniqueName: \"kubernetes.io/projected/2875461c-b8e5-439a-8c86-283c6669427d-kube-api-access-2sqlt\") pod \"kserve-controller-manager-549bc44c6d-pfjr5\" (UID: \"2875461c-b8e5-439a-8c86-283c6669427d\") " pod="kserve/kserve-controller-manager-549bc44c6d-pfjr5" Apr 24 22:35:27.506713 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.506594 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2875461c-b8e5-439a-8c86-283c6669427d-cert\") pod \"kserve-controller-manager-549bc44c6d-pfjr5\" (UID: \"2875461c-b8e5-439a-8c86-283c6669427d\") " pod="kserve/kserve-controller-manager-549bc44c6d-pfjr5" Apr 24 22:35:27.510996 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.510965 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-m2mvp"] Apr 24 22:35:27.607259 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.607219 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2875461c-b8e5-439a-8c86-283c6669427d-cert\") pod \"kserve-controller-manager-549bc44c6d-pfjr5\" (UID: \"2875461c-b8e5-439a-8c86-283c6669427d\") " pod="kserve/kserve-controller-manager-549bc44c6d-pfjr5" Apr 24 22:35:27.607259 ip-10-0-135-222 
kubenswrapper[2582]: I0424 22:35:27.607260 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlmjx\" (UniqueName: \"kubernetes.io/projected/4cff1d5d-90c6-4f9b-8de2-4273e5a66ab7-kube-api-access-tlmjx\") pod \"seaweedfs-86cc847c5c-m2mvp\" (UID: \"4cff1d5d-90c6-4f9b-8de2-4273e5a66ab7\") " pod="kserve/seaweedfs-86cc847c5c-m2mvp" Apr 24 22:35:27.607481 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.607280 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4cff1d5d-90c6-4f9b-8de2-4273e5a66ab7-data\") pod \"seaweedfs-86cc847c5c-m2mvp\" (UID: \"4cff1d5d-90c6-4f9b-8de2-4273e5a66ab7\") " pod="kserve/seaweedfs-86cc847c5c-m2mvp" Apr 24 22:35:27.607481 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.607373 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e39acfc7-5c5e-4bdc-9800-c2740dcbb193-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-ttmtv\" (UID: \"e39acfc7-5c5e-4bdc-9800-c2740dcbb193\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ttmtv" Apr 24 22:35:27.607481 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.607394 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gktpp\" (UniqueName: \"kubernetes.io/projected/e39acfc7-5c5e-4bdc-9800-c2740dcbb193-kube-api-access-gktpp\") pod \"llmisvc-controller-manager-68cc5db7c4-ttmtv\" (UID: \"e39acfc7-5c5e-4bdc-9800-c2740dcbb193\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ttmtv" Apr 24 22:35:27.607481 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.607414 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sqlt\" (UniqueName: \"kubernetes.io/projected/2875461c-b8e5-439a-8c86-283c6669427d-kube-api-access-2sqlt\") pod 
\"kserve-controller-manager-549bc44c6d-pfjr5\" (UID: \"2875461c-b8e5-439a-8c86-283c6669427d\") " pod="kserve/kserve-controller-manager-549bc44c6d-pfjr5" Apr 24 22:35:27.609825 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.609804 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2875461c-b8e5-439a-8c86-283c6669427d-cert\") pod \"kserve-controller-manager-549bc44c6d-pfjr5\" (UID: \"2875461c-b8e5-439a-8c86-283c6669427d\") " pod="kserve/kserve-controller-manager-549bc44c6d-pfjr5" Apr 24 22:35:27.615950 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.615922 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sqlt\" (UniqueName: \"kubernetes.io/projected/2875461c-b8e5-439a-8c86-283c6669427d-kube-api-access-2sqlt\") pod \"kserve-controller-manager-549bc44c6d-pfjr5\" (UID: \"2875461c-b8e5-439a-8c86-283c6669427d\") " pod="kserve/kserve-controller-manager-549bc44c6d-pfjr5" Apr 24 22:35:27.708110 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.707973 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e39acfc7-5c5e-4bdc-9800-c2740dcbb193-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-ttmtv\" (UID: \"e39acfc7-5c5e-4bdc-9800-c2740dcbb193\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ttmtv" Apr 24 22:35:27.708110 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.708031 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gktpp\" (UniqueName: \"kubernetes.io/projected/e39acfc7-5c5e-4bdc-9800-c2740dcbb193-kube-api-access-gktpp\") pod \"llmisvc-controller-manager-68cc5db7c4-ttmtv\" (UID: \"e39acfc7-5c5e-4bdc-9800-c2740dcbb193\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ttmtv" Apr 24 22:35:27.708339 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.708205 2582 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-tlmjx\" (UniqueName: \"kubernetes.io/projected/4cff1d5d-90c6-4f9b-8de2-4273e5a66ab7-kube-api-access-tlmjx\") pod \"seaweedfs-86cc847c5c-m2mvp\" (UID: \"4cff1d5d-90c6-4f9b-8de2-4273e5a66ab7\") " pod="kserve/seaweedfs-86cc847c5c-m2mvp" Apr 24 22:35:27.708339 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.708245 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4cff1d5d-90c6-4f9b-8de2-4273e5a66ab7-data\") pod \"seaweedfs-86cc847c5c-m2mvp\" (UID: \"4cff1d5d-90c6-4f9b-8de2-4273e5a66ab7\") " pod="kserve/seaweedfs-86cc847c5c-m2mvp" Apr 24 22:35:27.708611 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.708593 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4cff1d5d-90c6-4f9b-8de2-4273e5a66ab7-data\") pod \"seaweedfs-86cc847c5c-m2mvp\" (UID: \"4cff1d5d-90c6-4f9b-8de2-4273e5a66ab7\") " pod="kserve/seaweedfs-86cc847c5c-m2mvp" Apr 24 22:35:27.710593 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.710575 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e39acfc7-5c5e-4bdc-9800-c2740dcbb193-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-ttmtv\" (UID: \"e39acfc7-5c5e-4bdc-9800-c2740dcbb193\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ttmtv" Apr 24 22:35:27.716302 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.716272 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gktpp\" (UniqueName: \"kubernetes.io/projected/e39acfc7-5c5e-4bdc-9800-c2740dcbb193-kube-api-access-gktpp\") pod \"llmisvc-controller-manager-68cc5db7c4-ttmtv\" (UID: \"e39acfc7-5c5e-4bdc-9800-c2740dcbb193\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ttmtv" Apr 24 22:35:27.716434 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.716412 2582 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tlmjx\" (UniqueName: \"kubernetes.io/projected/4cff1d5d-90c6-4f9b-8de2-4273e5a66ab7-kube-api-access-tlmjx\") pod \"seaweedfs-86cc847c5c-m2mvp\" (UID: \"4cff1d5d-90c6-4f9b-8de2-4273e5a66ab7\") " pod="kserve/seaweedfs-86cc847c5c-m2mvp" Apr 24 22:35:27.777345 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.777261 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-549bc44c6d-pfjr5" Apr 24 22:35:27.787175 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.787136 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ttmtv" Apr 24 22:35:27.805958 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.805926 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-m2mvp" Apr 24 22:35:27.924887 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.924852 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-pfjr5"] Apr 24 22:35:27.929251 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:35:27.929205 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2875461c_b8e5_439a_8c86_283c6669427d.slice/crio-999a5575170f819457165ec772a2fb0fe9f5f31dce942744728a1b239f6c4a66 WatchSource:0}: Error finding container 999a5575170f819457165ec772a2fb0fe9f5f31dce942744728a1b239f6c4a66: Status 404 returned error can't find the container with id 999a5575170f819457165ec772a2fb0fe9f5f31dce942744728a1b239f6c4a66 Apr 24 22:35:27.930800 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.930779 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:35:27.948713 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.948676 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve/llmisvc-controller-manager-68cc5db7c4-ttmtv"] Apr 24 22:35:27.953369 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:35:27.953332 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode39acfc7_5c5e_4bdc_9800_c2740dcbb193.slice/crio-269ab19ea0b88a6c05a21c00d6aa52f4d5b76cd4346c900583a0536d85fbb7fb WatchSource:0}: Error finding container 269ab19ea0b88a6c05a21c00d6aa52f4d5b76cd4346c900583a0536d85fbb7fb: Status 404 returned error can't find the container with id 269ab19ea0b88a6c05a21c00d6aa52f4d5b76cd4346c900583a0536d85fbb7fb Apr 24 22:35:27.972658 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:27.972631 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-m2mvp"] Apr 24 22:35:27.976493 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:35:27.976462 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cff1d5d_90c6_4f9b_8de2_4273e5a66ab7.slice/crio-8d5867dcefe045549a0a3a92c6452ce332d12d475fa0bee8c9f6ca8afd9ef466 WatchSource:0}: Error finding container 8d5867dcefe045549a0a3a92c6452ce332d12d475fa0bee8c9f6ca8afd9ef466: Status 404 returned error can't find the container with id 8d5867dcefe045549a0a3a92c6452ce332d12d475fa0bee8c9f6ca8afd9ef466 Apr 24 22:35:28.758795 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:28.758740 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ttmtv" event={"ID":"e39acfc7-5c5e-4bdc-9800-c2740dcbb193","Type":"ContainerStarted","Data":"269ab19ea0b88a6c05a21c00d6aa52f4d5b76cd4346c900583a0536d85fbb7fb"} Apr 24 22:35:28.760010 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:28.759960 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-549bc44c6d-pfjr5" 
event={"ID":"2875461c-b8e5-439a-8c86-283c6669427d","Type":"ContainerStarted","Data":"999a5575170f819457165ec772a2fb0fe9f5f31dce942744728a1b239f6c4a66"} Apr 24 22:35:28.761260 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:28.761228 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-m2mvp" event={"ID":"4cff1d5d-90c6-4f9b-8de2-4273e5a66ab7","Type":"ContainerStarted","Data":"8d5867dcefe045549a0a3a92c6452ce332d12d475fa0bee8c9f6ca8afd9ef466"} Apr 24 22:35:33.785225 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:33.785177 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ttmtv" event={"ID":"e39acfc7-5c5e-4bdc-9800-c2740dcbb193","Type":"ContainerStarted","Data":"ffd14d4fbb2e07efd22a48f78326480f93b018d20123209757ba5f347d552a37"} Apr 24 22:35:33.785716 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:33.785411 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ttmtv" Apr 24 22:35:33.786634 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:33.786607 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-549bc44c6d-pfjr5" event={"ID":"2875461c-b8e5-439a-8c86-283c6669427d","Type":"ContainerStarted","Data":"4305004eaec7d5eb3262884b2f8323b05e1865df69894a63911ccb06d2cb55f2"} Apr 24 22:35:33.786759 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:33.786700 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-549bc44c6d-pfjr5" Apr 24 22:35:33.787903 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:33.787883 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-m2mvp" event={"ID":"4cff1d5d-90c6-4f9b-8de2-4273e5a66ab7","Type":"ContainerStarted","Data":"da66a189b6a78c6e8d93a423df413901488756104b1c38ad17edda12b2173685"} Apr 24 22:35:33.788010 ip-10-0-135-222 kubenswrapper[2582]: I0424 
22:35:33.787994 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-m2mvp"
Apr 24 22:35:33.802182 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:33.802128 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ttmtv" podStartSLOduration=2.035379939 podStartE2EDuration="6.80211139s" podCreationTimestamp="2026-04-24 22:35:27 +0000 UTC" firstStartedPulling="2026-04-24 22:35:27.955125183 +0000 UTC m=+347.898916493" lastFinishedPulling="2026-04-24 22:35:32.721856622 +0000 UTC m=+352.665647944" observedRunningTime="2026-04-24 22:35:33.800483909 +0000 UTC m=+353.744275240" watchObservedRunningTime="2026-04-24 22:35:33.80211139 +0000 UTC m=+353.745902722"
Apr 24 22:35:33.815247 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:33.815191 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-549bc44c6d-pfjr5" podStartSLOduration=2.024393077 podStartE2EDuration="6.815177533s" podCreationTimestamp="2026-04-24 22:35:27 +0000 UTC" firstStartedPulling="2026-04-24 22:35:27.930951278 +0000 UTC m=+347.874742598" lastFinishedPulling="2026-04-24 22:35:32.721735727 +0000 UTC m=+352.665527054" observedRunningTime="2026-04-24 22:35:33.814807474 +0000 UTC m=+353.758598805" watchObservedRunningTime="2026-04-24 22:35:33.815177533 +0000 UTC m=+353.758968864"
Apr 24 22:35:33.828471 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:33.828423 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-m2mvp" podStartSLOduration=2.028186232 podStartE2EDuration="6.828407411s" podCreationTimestamp="2026-04-24 22:35:27 +0000 UTC" firstStartedPulling="2026-04-24 22:35:27.977861261 +0000 UTC m=+347.921652570" lastFinishedPulling="2026-04-24 22:35:32.778082439 +0000 UTC m=+352.721873749" observedRunningTime="2026-04-24 22:35:33.828019411 +0000 UTC m=+353.771810747" watchObservedRunningTime="2026-04-24 22:35:33.828407411 +0000 UTC m=+353.772198742"
Apr 24 22:35:39.793592 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:35:39.793553 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-m2mvp"
Apr 24 22:36:04.794153 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:04.794122 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ttmtv"
Apr 24 22:36:04.797236 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:04.797212 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-549bc44c6d-pfjr5"
Apr 24 22:36:06.261611 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.261574 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-pfjr5"]
Apr 24 22:36:06.261992 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.261794 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-549bc44c6d-pfjr5" podUID="2875461c-b8e5-439a-8c86-283c6669427d" containerName="manager" containerID="cri-o://4305004eaec7d5eb3262884b2f8323b05e1865df69894a63911ccb06d2cb55f2" gracePeriod=10
Apr 24 22:36:06.284387 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.284358 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-l85mp"]
Apr 24 22:36:06.359206 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.359179 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-l85mp"]
Apr 24 22:36:06.359318 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.359307 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-549bc44c6d-l85mp"
Apr 24 22:36:06.407534 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.407500 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhg2z\" (UniqueName: \"kubernetes.io/projected/b2796c7a-f28e-4c0a-bdee-01d872204bc0-kube-api-access-dhg2z\") pod \"kserve-controller-manager-549bc44c6d-l85mp\" (UID: \"b2796c7a-f28e-4c0a-bdee-01d872204bc0\") " pod="kserve/kserve-controller-manager-549bc44c6d-l85mp"
Apr 24 22:36:06.407720 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.407559 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2796c7a-f28e-4c0a-bdee-01d872204bc0-cert\") pod \"kserve-controller-manager-549bc44c6d-l85mp\" (UID: \"b2796c7a-f28e-4c0a-bdee-01d872204bc0\") " pod="kserve/kserve-controller-manager-549bc44c6d-l85mp"
Apr 24 22:36:06.508754 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.508719 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhg2z\" (UniqueName: \"kubernetes.io/projected/b2796c7a-f28e-4c0a-bdee-01d872204bc0-kube-api-access-dhg2z\") pod \"kserve-controller-manager-549bc44c6d-l85mp\" (UID: \"b2796c7a-f28e-4c0a-bdee-01d872204bc0\") " pod="kserve/kserve-controller-manager-549bc44c6d-l85mp"
Apr 24 22:36:06.508945 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.508799 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2796c7a-f28e-4c0a-bdee-01d872204bc0-cert\") pod \"kserve-controller-manager-549bc44c6d-l85mp\" (UID: \"b2796c7a-f28e-4c0a-bdee-01d872204bc0\") " pod="kserve/kserve-controller-manager-549bc44c6d-l85mp"
Apr 24 22:36:06.511406 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.511383 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2796c7a-f28e-4c0a-bdee-01d872204bc0-cert\") pod \"kserve-controller-manager-549bc44c6d-l85mp\" (UID: \"b2796c7a-f28e-4c0a-bdee-01d872204bc0\") " pod="kserve/kserve-controller-manager-549bc44c6d-l85mp"
Apr 24 22:36:06.526210 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.526181 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-549bc44c6d-pfjr5"
Apr 24 22:36:06.526347 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.526217 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhg2z\" (UniqueName: \"kubernetes.io/projected/b2796c7a-f28e-4c0a-bdee-01d872204bc0-kube-api-access-dhg2z\") pod \"kserve-controller-manager-549bc44c6d-l85mp\" (UID: \"b2796c7a-f28e-4c0a-bdee-01d872204bc0\") " pod="kserve/kserve-controller-manager-549bc44c6d-l85mp"
Apr 24 22:36:06.609443 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.609408 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2875461c-b8e5-439a-8c86-283c6669427d-cert\") pod \"2875461c-b8e5-439a-8c86-283c6669427d\" (UID: \"2875461c-b8e5-439a-8c86-283c6669427d\") "
Apr 24 22:36:06.609632 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.609467 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sqlt\" (UniqueName: \"kubernetes.io/projected/2875461c-b8e5-439a-8c86-283c6669427d-kube-api-access-2sqlt\") pod \"2875461c-b8e5-439a-8c86-283c6669427d\" (UID: \"2875461c-b8e5-439a-8c86-283c6669427d\") "
Apr 24 22:36:06.611834 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.611806 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2875461c-b8e5-439a-8c86-283c6669427d-cert" (OuterVolumeSpecName: "cert") pod "2875461c-b8e5-439a-8c86-283c6669427d" (UID: "2875461c-b8e5-439a-8c86-283c6669427d"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:36:06.611944 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.611830 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2875461c-b8e5-439a-8c86-283c6669427d-kube-api-access-2sqlt" (OuterVolumeSpecName: "kube-api-access-2sqlt") pod "2875461c-b8e5-439a-8c86-283c6669427d" (UID: "2875461c-b8e5-439a-8c86-283c6669427d"). InnerVolumeSpecName "kube-api-access-2sqlt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:36:06.710544 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.710508 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2sqlt\" (UniqueName: \"kubernetes.io/projected/2875461c-b8e5-439a-8c86-283c6669427d-kube-api-access-2sqlt\") on node \"ip-10-0-135-222.ec2.internal\" DevicePath \"\""
Apr 24 22:36:06.710544 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.710541 2582 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2875461c-b8e5-439a-8c86-283c6669427d-cert\") on node \"ip-10-0-135-222.ec2.internal\" DevicePath \"\""
Apr 24 22:36:06.720074 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.720032 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-549bc44c6d-l85mp"
Apr 24 22:36:06.844433 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.844399 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-l85mp"]
Apr 24 22:36:06.847746 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:36:06.847702 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2796c7a_f28e_4c0a_bdee_01d872204bc0.slice/crio-26b1bc495f26726280145c46b02dab07710e1c2f4e3bdc80141e4df64c6784d3 WatchSource:0}: Error finding container 26b1bc495f26726280145c46b02dab07710e1c2f4e3bdc80141e4df64c6784d3: Status 404 returned error can't find the container with id 26b1bc495f26726280145c46b02dab07710e1c2f4e3bdc80141e4df64c6784d3
Apr 24 22:36:06.892138 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.892093 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-549bc44c6d-l85mp" event={"ID":"b2796c7a-f28e-4c0a-bdee-01d872204bc0","Type":"ContainerStarted","Data":"26b1bc495f26726280145c46b02dab07710e1c2f4e3bdc80141e4df64c6784d3"}
Apr 24 22:36:06.893204 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.893178 2582 generic.go:358] "Generic (PLEG): container finished" podID="2875461c-b8e5-439a-8c86-283c6669427d" containerID="4305004eaec7d5eb3262884b2f8323b05e1865df69894a63911ccb06d2cb55f2" exitCode=0
Apr 24 22:36:06.893304 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.893222 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-549bc44c6d-pfjr5" event={"ID":"2875461c-b8e5-439a-8c86-283c6669427d","Type":"ContainerDied","Data":"4305004eaec7d5eb3262884b2f8323b05e1865df69894a63911ccb06d2cb55f2"}
Apr 24 22:36:06.893304 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.893244 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-549bc44c6d-pfjr5" event={"ID":"2875461c-b8e5-439a-8c86-283c6669427d","Type":"ContainerDied","Data":"999a5575170f819457165ec772a2fb0fe9f5f31dce942744728a1b239f6c4a66"}
Apr 24 22:36:06.893304 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.893243 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-549bc44c6d-pfjr5"
Apr 24 22:36:06.893304 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.893256 2582 scope.go:117] "RemoveContainer" containerID="4305004eaec7d5eb3262884b2f8323b05e1865df69894a63911ccb06d2cb55f2"
Apr 24 22:36:06.901148 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.901130 2582 scope.go:117] "RemoveContainer" containerID="4305004eaec7d5eb3262884b2f8323b05e1865df69894a63911ccb06d2cb55f2"
Apr 24 22:36:06.901415 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:36:06.901393 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4305004eaec7d5eb3262884b2f8323b05e1865df69894a63911ccb06d2cb55f2\": container with ID starting with 4305004eaec7d5eb3262884b2f8323b05e1865df69894a63911ccb06d2cb55f2 not found: ID does not exist" containerID="4305004eaec7d5eb3262884b2f8323b05e1865df69894a63911ccb06d2cb55f2"
Apr 24 22:36:06.901493 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.901421 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4305004eaec7d5eb3262884b2f8323b05e1865df69894a63911ccb06d2cb55f2"} err="failed to get container status \"4305004eaec7d5eb3262884b2f8323b05e1865df69894a63911ccb06d2cb55f2\": rpc error: code = NotFound desc = could not find container \"4305004eaec7d5eb3262884b2f8323b05e1865df69894a63911ccb06d2cb55f2\": container with ID starting with 4305004eaec7d5eb3262884b2f8323b05e1865df69894a63911ccb06d2cb55f2 not found: ID does not exist"
Apr 24 22:36:06.909600 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.909568 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-pfjr5"]
Apr 24 22:36:06.914245 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:06.914223 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-pfjr5"]
Apr 24 22:36:07.898476 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:07.898437 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-549bc44c6d-l85mp" event={"ID":"b2796c7a-f28e-4c0a-bdee-01d872204bc0","Type":"ContainerStarted","Data":"1f09a20dbf760bb7ec303e473a0604e69f593b5c38cac305fa086662c759c7dc"}
Apr 24 22:36:07.898961 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:07.898533 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-549bc44c6d-l85mp"
Apr 24 22:36:07.914152 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:07.914105 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-549bc44c6d-l85mp" podStartSLOduration=1.611466316 podStartE2EDuration="1.914087848s" podCreationTimestamp="2026-04-24 22:36:06 +0000 UTC" firstStartedPulling="2026-04-24 22:36:06.848990537 +0000 UTC m=+386.792781847" lastFinishedPulling="2026-04-24 22:36:07.151612064 +0000 UTC m=+387.095403379" observedRunningTime="2026-04-24 22:36:07.913265795 +0000 UTC m=+387.857057125" watchObservedRunningTime="2026-04-24 22:36:07.914087848 +0000 UTC m=+387.857879178"
Apr 24 22:36:08.636915 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:08.636874 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2875461c-b8e5-439a-8c86-283c6669427d" path="/var/lib/kubelet/pods/2875461c-b8e5-439a-8c86-283c6669427d/volumes"
Apr 24 22:36:38.907538 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:38.907500 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-549bc44c6d-l85mp"
Apr 24 22:36:40.017904 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:40.017865 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-c6rnt"]
Apr 24 22:36:40.018298 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:40.018206 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2875461c-b8e5-439a-8c86-283c6669427d" containerName="manager"
Apr 24 22:36:40.018298 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:40.018217 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="2875461c-b8e5-439a-8c86-283c6669427d" containerName="manager"
Apr 24 22:36:40.018298 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:40.018266 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="2875461c-b8e5-439a-8c86-283c6669427d" containerName="manager"
Apr 24 22:36:40.021526 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:40.021509 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-c6rnt"
Apr 24 22:36:40.023598 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:40.023576 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 24 22:36:40.023598 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:40.023592 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-8f7nt\""
Apr 24 22:36:40.030990 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:40.030961 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-c6rnt"]
Apr 24 22:36:40.166625 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:40.166585 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7q6z\" (UniqueName: \"kubernetes.io/projected/dabe2de2-793c-48e3-90f8-b49ff5ec1fa9-kube-api-access-r7q6z\") pod \"odh-model-controller-696fc77849-c6rnt\" (UID: \"dabe2de2-793c-48e3-90f8-b49ff5ec1fa9\") " pod="kserve/odh-model-controller-696fc77849-c6rnt"
Apr 24 22:36:40.166800 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:40.166639 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dabe2de2-793c-48e3-90f8-b49ff5ec1fa9-cert\") pod \"odh-model-controller-696fc77849-c6rnt\" (UID: \"dabe2de2-793c-48e3-90f8-b49ff5ec1fa9\") " pod="kserve/odh-model-controller-696fc77849-c6rnt"
Apr 24 22:36:40.267762 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:40.267697 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7q6z\" (UniqueName: \"kubernetes.io/projected/dabe2de2-793c-48e3-90f8-b49ff5ec1fa9-kube-api-access-r7q6z\") pod \"odh-model-controller-696fc77849-c6rnt\" (UID: \"dabe2de2-793c-48e3-90f8-b49ff5ec1fa9\") " pod="kserve/odh-model-controller-696fc77849-c6rnt"
Apr 24 22:36:40.267960 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:40.267795 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dabe2de2-793c-48e3-90f8-b49ff5ec1fa9-cert\") pod \"odh-model-controller-696fc77849-c6rnt\" (UID: \"dabe2de2-793c-48e3-90f8-b49ff5ec1fa9\") " pod="kserve/odh-model-controller-696fc77849-c6rnt"
Apr 24 22:36:40.270736 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:40.270701 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dabe2de2-793c-48e3-90f8-b49ff5ec1fa9-cert\") pod \"odh-model-controller-696fc77849-c6rnt\" (UID: \"dabe2de2-793c-48e3-90f8-b49ff5ec1fa9\") " pod="kserve/odh-model-controller-696fc77849-c6rnt"
Apr 24 22:36:40.275646 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:40.275606 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7q6z\" (UniqueName: \"kubernetes.io/projected/dabe2de2-793c-48e3-90f8-b49ff5ec1fa9-kube-api-access-r7q6z\") pod \"odh-model-controller-696fc77849-c6rnt\" (UID: \"dabe2de2-793c-48e3-90f8-b49ff5ec1fa9\") " pod="kserve/odh-model-controller-696fc77849-c6rnt"
Apr 24 22:36:40.335158 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:40.335104 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-c6rnt"
Apr 24 22:36:40.498857 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:40.498824 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-c6rnt"]
Apr 24 22:36:40.502324 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:36:40.502295 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddabe2de2_793c_48e3_90f8_b49ff5ec1fa9.slice/crio-f8a5cc7c465584c5befa50cd4165f0bdd4fe0bc69caeb5a9e25e395606fbf9c4 WatchSource:0}: Error finding container f8a5cc7c465584c5befa50cd4165f0bdd4fe0bc69caeb5a9e25e395606fbf9c4: Status 404 returned error can't find the container with id f8a5cc7c465584c5befa50cd4165f0bdd4fe0bc69caeb5a9e25e395606fbf9c4
Apr 24 22:36:41.008081 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:41.007955 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-c6rnt" event={"ID":"dabe2de2-793c-48e3-90f8-b49ff5ec1fa9","Type":"ContainerStarted","Data":"f8a5cc7c465584c5befa50cd4165f0bdd4fe0bc69caeb5a9e25e395606fbf9c4"}
Apr 24 22:36:43.016096 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:43.016031 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-c6rnt" event={"ID":"dabe2de2-793c-48e3-90f8-b49ff5ec1fa9","Type":"ContainerStarted","Data":"2b9e5126228bd106ef56ce0d9cb896b94650ac62a7b4fe4d857673a63c9e18aa"}
Apr 24 22:36:43.016466 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:43.016223 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-c6rnt"
Apr 24 22:36:43.031647 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:43.031587 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-c6rnt" podStartSLOduration=1.632650551 podStartE2EDuration="4.031569071s" podCreationTimestamp="2026-04-24 22:36:39 +0000 UTC" firstStartedPulling="2026-04-24 22:36:40.503687044 +0000 UTC m=+420.447478356" lastFinishedPulling="2026-04-24 22:36:42.90260555 +0000 UTC m=+422.846396876" observedRunningTime="2026-04-24 22:36:43.030051912 +0000 UTC m=+422.973843242" watchObservedRunningTime="2026-04-24 22:36:43.031569071 +0000 UTC m=+422.975360425"
Apr 24 22:36:54.021302 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:36:54.021267 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-c6rnt"
Apr 24 22:39:40.549236 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:39:40.549198 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzsgr_239c26d8-bd64-4f99-9455-4fceceb609ee/ovn-acl-logging/0.log"
Apr 24 22:39:40.551605 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:39:40.551584 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzsgr_239c26d8-bd64-4f99-9455-4fceceb609ee/ovn-acl-logging/0.log"
Apr 24 22:40:19.048910 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:19.048869 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg"]
Apr 24 22:40:19.051933 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:19.051911 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg"
Apr 24 22:40:19.053914 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:19.053891 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-gtqzv\""
Apr 24 22:40:19.054333 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:19.054309 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-d4694-kube-rbac-proxy-sar-config\""
Apr 24 22:40:19.054455 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:19.054435 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-d4694-serving-cert\""
Apr 24 22:40:19.054589 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:19.054572 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 24 22:40:19.066166 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:19.066127 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg"]
Apr 24 22:40:19.110479 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:19.110444 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3cb48c49-1c88-4662-a812-ceca96ed4fc7-proxy-tls\") pod \"model-chainer-raw-d4694-69669455cb-t9wzg\" (UID: \"3cb48c49-1c88-4662-a812-ceca96ed4fc7\") " pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg"
Apr 24 22:40:19.110644 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:19.110499 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cb48c49-1c88-4662-a812-ceca96ed4fc7-openshift-service-ca-bundle\") pod \"model-chainer-raw-d4694-69669455cb-t9wzg\" (UID: \"3cb48c49-1c88-4662-a812-ceca96ed4fc7\") " pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg"
Apr 24 22:40:19.211484 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:19.211441 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3cb48c49-1c88-4662-a812-ceca96ed4fc7-proxy-tls\") pod \"model-chainer-raw-d4694-69669455cb-t9wzg\" (UID: \"3cb48c49-1c88-4662-a812-ceca96ed4fc7\") " pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg"
Apr 24 22:40:19.211660 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:19.211524 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cb48c49-1c88-4662-a812-ceca96ed4fc7-openshift-service-ca-bundle\") pod \"model-chainer-raw-d4694-69669455cb-t9wzg\" (UID: \"3cb48c49-1c88-4662-a812-ceca96ed4fc7\") " pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg"
Apr 24 22:40:19.212183 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:19.212139 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cb48c49-1c88-4662-a812-ceca96ed4fc7-openshift-service-ca-bundle\") pod \"model-chainer-raw-d4694-69669455cb-t9wzg\" (UID: \"3cb48c49-1c88-4662-a812-ceca96ed4fc7\") " pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg"
Apr 24 22:40:19.213944 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:19.213921 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3cb48c49-1c88-4662-a812-ceca96ed4fc7-proxy-tls\") pod \"model-chainer-raw-d4694-69669455cb-t9wzg\" (UID: \"3cb48c49-1c88-4662-a812-ceca96ed4fc7\") " pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg"
Apr 24 22:40:19.362453 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:19.362370 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg"
Apr 24 22:40:19.483384 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:19.483358 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg"]
Apr 24 22:40:19.485997 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:40:19.485967 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cb48c49_1c88_4662_a812_ceca96ed4fc7.slice/crio-1fd50ffe818e4b7b3638c7e979372b4126eeba9e8fac70bf3eb7f325427f5484 WatchSource:0}: Error finding container 1fd50ffe818e4b7b3638c7e979372b4126eeba9e8fac70bf3eb7f325427f5484: Status 404 returned error can't find the container with id 1fd50ffe818e4b7b3638c7e979372b4126eeba9e8fac70bf3eb7f325427f5484
Apr 24 22:40:19.715612 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:19.715518 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg" event={"ID":"3cb48c49-1c88-4662-a812-ceca96ed4fc7","Type":"ContainerStarted","Data":"1fd50ffe818e4b7b3638c7e979372b4126eeba9e8fac70bf3eb7f325427f5484"}
Apr 24 22:40:21.727915 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:21.727876 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg" event={"ID":"3cb48c49-1c88-4662-a812-ceca96ed4fc7","Type":"ContainerStarted","Data":"d209244221ab982960f6458623d85c5c08734654628733f028c45996877009fe"}
Apr 24 22:40:21.728339 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:21.728104 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg"
Apr 24 22:40:21.742417 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:21.742361 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg" podStartSLOduration=1.092604607 podStartE2EDuration="2.742346148s" podCreationTimestamp="2026-04-24 22:40:19 +0000 UTC" firstStartedPulling="2026-04-24 22:40:19.48843548 +0000 UTC m=+639.432226802" lastFinishedPulling="2026-04-24 22:40:21.138177022 +0000 UTC m=+641.081968343" observedRunningTime="2026-04-24 22:40:21.740645475 +0000 UTC m=+641.684436804" watchObservedRunningTime="2026-04-24 22:40:21.742346148 +0000 UTC m=+641.686137492"
Apr 24 22:40:27.737467 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:27.737433 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg"
Apr 24 22:40:29.073286 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:29.073253 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg"]
Apr 24 22:40:29.073778 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:29.073484 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg" podUID="3cb48c49-1c88-4662-a812-ceca96ed4fc7" containerName="model-chainer-raw-d4694" containerID="cri-o://d209244221ab982960f6458623d85c5c08734654628733f028c45996877009fe" gracePeriod=30
Apr 24 22:40:32.736145 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:32.736102 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg" podUID="3cb48c49-1c88-4662-a812-ceca96ed4fc7" containerName="model-chainer-raw-d4694" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:40:37.735842 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:37.735801 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg" podUID="3cb48c49-1c88-4662-a812-ceca96ed4fc7" containerName="model-chainer-raw-d4694" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:40:42.735327 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:42.735286 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg" podUID="3cb48c49-1c88-4662-a812-ceca96ed4fc7" containerName="model-chainer-raw-d4694" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:40:42.735783 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:42.735418 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg"
Apr 24 22:40:47.735359 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:47.735319 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg" podUID="3cb48c49-1c88-4662-a812-ceca96ed4fc7" containerName="model-chainer-raw-d4694" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:40:52.735647 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:52.735610 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg" podUID="3cb48c49-1c88-4662-a812-ceca96ed4fc7" containerName="model-chainer-raw-d4694" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:40:57.735626 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:57.735582 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg" podUID="3cb48c49-1c88-4662-a812-ceca96ed4fc7" containerName="model-chainer-raw-d4694" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:40:59.102096 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:40:59.102042 2582 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cb48c49_1c88_4662_a812_ceca96ed4fc7.slice/crio-conmon-d209244221ab982960f6458623d85c5c08734654628733f028c45996877009fe.scope\": RecentStats: unable to find data in memory cache]"
Apr 24 22:40:59.102096 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:40:59.102089 2582 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cb48c49_1c88_4662_a812_ceca96ed4fc7.slice/crio-conmon-d209244221ab982960f6458623d85c5c08734654628733f028c45996877009fe.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cb48c49_1c88_4662_a812_ceca96ed4fc7.slice/crio-d209244221ab982960f6458623d85c5c08734654628733f028c45996877009fe.scope\": RecentStats: unable to find data in memory cache]"
Apr 24 22:40:59.716207 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:59.716185 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg"
Apr 24 22:40:59.818104 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:59.818038 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3cb48c49-1c88-4662-a812-ceca96ed4fc7-proxy-tls\") pod \"3cb48c49-1c88-4662-a812-ceca96ed4fc7\" (UID: \"3cb48c49-1c88-4662-a812-ceca96ed4fc7\") "
Apr 24 22:40:59.818274 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:59.818112 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cb48c49-1c88-4662-a812-ceca96ed4fc7-openshift-service-ca-bundle\") pod \"3cb48c49-1c88-4662-a812-ceca96ed4fc7\" (UID: \"3cb48c49-1c88-4662-a812-ceca96ed4fc7\") "
Apr 24 22:40:59.818483 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:59.818459 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb48c49-1c88-4662-a812-ceca96ed4fc7-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "3cb48c49-1c88-4662-a812-ceca96ed4fc7" (UID: "3cb48c49-1c88-4662-a812-ceca96ed4fc7"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:40:59.820486 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:59.820454 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cb48c49-1c88-4662-a812-ceca96ed4fc7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3cb48c49-1c88-4662-a812-ceca96ed4fc7" (UID: "3cb48c49-1c88-4662-a812-ceca96ed4fc7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:40:59.850403 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:59.850317 2582 generic.go:358] "Generic (PLEG): container finished" podID="3cb48c49-1c88-4662-a812-ceca96ed4fc7" containerID="d209244221ab982960f6458623d85c5c08734654628733f028c45996877009fe" exitCode=0
Apr 24 22:40:59.850403 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:59.850380 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg"
Apr 24 22:40:59.850559 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:59.850400 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg" event={"ID":"3cb48c49-1c88-4662-a812-ceca96ed4fc7","Type":"ContainerDied","Data":"d209244221ab982960f6458623d85c5c08734654628733f028c45996877009fe"}
Apr 24 22:40:59.850559 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:59.850438 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg" event={"ID":"3cb48c49-1c88-4662-a812-ceca96ed4fc7","Type":"ContainerDied","Data":"1fd50ffe818e4b7b3638c7e979372b4126eeba9e8fac70bf3eb7f325427f5484"}
Apr 24 22:40:59.850559 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:59.850454 2582 scope.go:117] "RemoveContainer" containerID="d209244221ab982960f6458623d85c5c08734654628733f028c45996877009fe"
Apr 24 22:40:59.858506 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:59.858487 2582 scope.go:117] "RemoveContainer" containerID="d209244221ab982960f6458623d85c5c08734654628733f028c45996877009fe"
Apr 24 22:40:59.858761 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:40:59.858743 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d209244221ab982960f6458623d85c5c08734654628733f028c45996877009fe\": container with ID starting with d209244221ab982960f6458623d85c5c08734654628733f028c45996877009fe not found: ID does not exist" containerID="d209244221ab982960f6458623d85c5c08734654628733f028c45996877009fe"
Apr 24 22:40:59.858817 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:59.858774 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d209244221ab982960f6458623d85c5c08734654628733f028c45996877009fe"} err="failed to get container status \"d209244221ab982960f6458623d85c5c08734654628733f028c45996877009fe\": rpc error: code = NotFound desc = could not find container \"d209244221ab982960f6458623d85c5c08734654628733f028c45996877009fe\": container with ID starting with d209244221ab982960f6458623d85c5c08734654628733f028c45996877009fe not found: ID does not exist"
Apr 24 22:40:59.868851 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:59.868818 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg"]
Apr 24 22:40:59.872379 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:59.872348 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-d4694-69669455cb-t9wzg"]
Apr 24 22:40:59.919573 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:59.919548 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3cb48c49-1c88-4662-a812-ceca96ed4fc7-proxy-tls\") on node \"ip-10-0-135-222.ec2.internal\" DevicePath \"\""
Apr 24 22:40:59.919573 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:40:59.919571 2582 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cb48c49-1c88-4662-a812-ceca96ed4fc7-openshift-service-ca-bundle\") on node \"ip-10-0-135-222.ec2.internal\" DevicePath \"\""
Apr 24 22:41:00.637983 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:41:00.637948 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb48c49-1c88-4662-a812-ceca96ed4fc7" path="/var/lib/kubelet/pods/3cb48c49-1c88-4662-a812-ceca96ed4fc7/volumes"
Apr 24 22:41:59.319201 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:41:59.319162 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5"]
Apr 24 22:41:59.319591 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:41:59.319510 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3cb48c49-1c88-4662-a812-ceca96ed4fc7" containerName="model-chainer-raw-d4694"
Apr 24 22:41:59.319591 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:41:59.319525 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cb48c49-1c88-4662-a812-ceca96ed4fc7" containerName="model-chainer-raw-d4694"
Apr 24 22:41:59.319591 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:41:59.319579 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="3cb48c49-1c88-4662-a812-ceca96ed4fc7" containerName="model-chainer-raw-d4694"
Apr 24 22:41:59.323669 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:41:59.323649 2582 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" Apr 24 22:41:59.325387 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:41:59.325354 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-gtqzv\"" Apr 24 22:41:59.325498 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:41:59.325355 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 22:41:59.325498 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:41:59.325396 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-133b5-serving-cert\"" Apr 24 22:41:59.325612 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:41:59.325596 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-133b5-kube-rbac-proxy-sar-config\"" Apr 24 22:41:59.328841 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:41:59.328821 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5"] Apr 24 22:41:59.393968 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:41:59.393931 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30fc9e5b-d308-4796-8b2c-a69ec1e808ab-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5\" (UID: \"30fc9e5b-d308-4796-8b2c-a69ec1e808ab\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" Apr 24 22:41:59.394155 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:41:59.393980 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30fc9e5b-d308-4796-8b2c-a69ec1e808ab-proxy-tls\") pod 
\"model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5\" (UID: \"30fc9e5b-d308-4796-8b2c-a69ec1e808ab\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" Apr 24 22:41:59.495369 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:41:59.495337 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30fc9e5b-d308-4796-8b2c-a69ec1e808ab-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5\" (UID: \"30fc9e5b-d308-4796-8b2c-a69ec1e808ab\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" Apr 24 22:41:59.495533 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:41:59.495395 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30fc9e5b-d308-4796-8b2c-a69ec1e808ab-proxy-tls\") pod \"model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5\" (UID: \"30fc9e5b-d308-4796-8b2c-a69ec1e808ab\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" Apr 24 22:41:59.495976 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:41:59.495954 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30fc9e5b-d308-4796-8b2c-a69ec1e808ab-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5\" (UID: \"30fc9e5b-d308-4796-8b2c-a69ec1e808ab\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" Apr 24 22:41:59.498009 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:41:59.497989 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30fc9e5b-d308-4796-8b2c-a69ec1e808ab-proxy-tls\") pod \"model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5\" (UID: \"30fc9e5b-d308-4796-8b2c-a69ec1e808ab\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" Apr 24 22:41:59.636250 
ip-10-0-135-222 kubenswrapper[2582]: I0424 22:41:59.636161 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" Apr 24 22:41:59.758601 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:41:59.758445 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5"] Apr 24 22:41:59.761260 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:41:59.761227 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30fc9e5b_d308_4796_8b2c_a69ec1e808ab.slice/crio-cfec4835b7f14b9c85457257a491596977ab4c4c88f4c2aedbc8b9006ee64204 WatchSource:0}: Error finding container cfec4835b7f14b9c85457257a491596977ab4c4c88f4c2aedbc8b9006ee64204: Status 404 returned error can't find the container with id cfec4835b7f14b9c85457257a491596977ab4c4c88f4c2aedbc8b9006ee64204 Apr 24 22:41:59.762952 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:41:59.762936 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:42:00.041810 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:00.041771 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" event={"ID":"30fc9e5b-d308-4796-8b2c-a69ec1e808ab","Type":"ContainerStarted","Data":"e165f99e22a2a8118b4a781d4041cde3ed73dd3d302e036de59e703c11de2328"} Apr 24 22:42:00.041810 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:00.041809 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" event={"ID":"30fc9e5b-d308-4796-8b2c-a69ec1e808ab","Type":"ContainerStarted","Data":"cfec4835b7f14b9c85457257a491596977ab4c4c88f4c2aedbc8b9006ee64204"} Apr 24 22:42:00.042051 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:00.041834 2582 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" Apr 24 22:42:00.057685 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:00.057633 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" podStartSLOduration=1.057618284 podStartE2EDuration="1.057618284s" podCreationTimestamp="2026-04-24 22:41:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:42:00.055487173 +0000 UTC m=+739.999278517" watchObservedRunningTime="2026-04-24 22:42:00.057618284 +0000 UTC m=+740.001409615" Apr 24 22:42:06.051365 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:06.051326 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" Apr 24 22:42:09.365919 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:09.365883 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5"] Apr 24 22:42:09.366386 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:09.366147 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" podUID="30fc9e5b-d308-4796-8b2c-a69ec1e808ab" containerName="model-chainer-raw-hpa-133b5" containerID="cri-o://e165f99e22a2a8118b4a781d4041cde3ed73dd3d302e036de59e703c11de2328" gracePeriod=30 Apr 24 22:42:11.049111 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:11.049037 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" podUID="30fc9e5b-d308-4796-8b2c-a69ec1e808ab" containerName="model-chainer-raw-hpa-133b5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:42:16.048628 ip-10-0-135-222 
kubenswrapper[2582]: I0424 22:42:16.048583 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" podUID="30fc9e5b-d308-4796-8b2c-a69ec1e808ab" containerName="model-chainer-raw-hpa-133b5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:42:21.049001 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:21.048959 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" podUID="30fc9e5b-d308-4796-8b2c-a69ec1e808ab" containerName="model-chainer-raw-hpa-133b5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:42:21.049423 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:21.049094 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" Apr 24 22:42:26.048781 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:26.048744 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" podUID="30fc9e5b-d308-4796-8b2c-a69ec1e808ab" containerName="model-chainer-raw-hpa-133b5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:42:31.050102 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:31.050043 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" podUID="30fc9e5b-d308-4796-8b2c-a69ec1e808ab" containerName="model-chainer-raw-hpa-133b5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:42:36.049450 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:36.049408 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" podUID="30fc9e5b-d308-4796-8b2c-a69ec1e808ab" containerName="model-chainer-raw-hpa-133b5" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Apr 24 22:42:39.512527 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:39.512503 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" Apr 24 22:42:39.619998 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:39.619911 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30fc9e5b-d308-4796-8b2c-a69ec1e808ab-openshift-service-ca-bundle\") pod \"30fc9e5b-d308-4796-8b2c-a69ec1e808ab\" (UID: \"30fc9e5b-d308-4796-8b2c-a69ec1e808ab\") " Apr 24 22:42:39.619998 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:39.619986 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30fc9e5b-d308-4796-8b2c-a69ec1e808ab-proxy-tls\") pod \"30fc9e5b-d308-4796-8b2c-a69ec1e808ab\" (UID: \"30fc9e5b-d308-4796-8b2c-a69ec1e808ab\") " Apr 24 22:42:39.620389 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:39.620358 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30fc9e5b-d308-4796-8b2c-a69ec1e808ab-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "30fc9e5b-d308-4796-8b2c-a69ec1e808ab" (UID: "30fc9e5b-d308-4796-8b2c-a69ec1e808ab"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:42:39.622250 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:39.622222 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30fc9e5b-d308-4796-8b2c-a69ec1e808ab-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "30fc9e5b-d308-4796-8b2c-a69ec1e808ab" (UID: "30fc9e5b-d308-4796-8b2c-a69ec1e808ab"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:42:39.720471 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:39.720424 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30fc9e5b-d308-4796-8b2c-a69ec1e808ab-proxy-tls\") on node \"ip-10-0-135-222.ec2.internal\" DevicePath \"\"" Apr 24 22:42:39.720471 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:39.720467 2582 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30fc9e5b-d308-4796-8b2c-a69ec1e808ab-openshift-service-ca-bundle\") on node \"ip-10-0-135-222.ec2.internal\" DevicePath \"\"" Apr 24 22:42:40.169148 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:40.169106 2582 generic.go:358] "Generic (PLEG): container finished" podID="30fc9e5b-d308-4796-8b2c-a69ec1e808ab" containerID="e165f99e22a2a8118b4a781d4041cde3ed73dd3d302e036de59e703c11de2328" exitCode=0 Apr 24 22:42:40.169312 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:40.169172 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" Apr 24 22:42:40.169312 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:40.169172 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" event={"ID":"30fc9e5b-d308-4796-8b2c-a69ec1e808ab","Type":"ContainerDied","Data":"e165f99e22a2a8118b4a781d4041cde3ed73dd3d302e036de59e703c11de2328"} Apr 24 22:42:40.169312 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:40.169277 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5" event={"ID":"30fc9e5b-d308-4796-8b2c-a69ec1e808ab","Type":"ContainerDied","Data":"cfec4835b7f14b9c85457257a491596977ab4c4c88f4c2aedbc8b9006ee64204"} Apr 24 22:42:40.169312 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:40.169299 2582 scope.go:117] "RemoveContainer" containerID="e165f99e22a2a8118b4a781d4041cde3ed73dd3d302e036de59e703c11de2328" Apr 24 22:42:40.177639 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:40.177621 2582 scope.go:117] "RemoveContainer" containerID="e165f99e22a2a8118b4a781d4041cde3ed73dd3d302e036de59e703c11de2328" Apr 24 22:42:40.177906 ip-10-0-135-222 kubenswrapper[2582]: E0424 22:42:40.177885 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e165f99e22a2a8118b4a781d4041cde3ed73dd3d302e036de59e703c11de2328\": container with ID starting with e165f99e22a2a8118b4a781d4041cde3ed73dd3d302e036de59e703c11de2328 not found: ID does not exist" containerID="e165f99e22a2a8118b4a781d4041cde3ed73dd3d302e036de59e703c11de2328" Apr 24 22:42:40.177957 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:40.177915 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e165f99e22a2a8118b4a781d4041cde3ed73dd3d302e036de59e703c11de2328"} err="failed to get container status 
\"e165f99e22a2a8118b4a781d4041cde3ed73dd3d302e036de59e703c11de2328\": rpc error: code = NotFound desc = could not find container \"e165f99e22a2a8118b4a781d4041cde3ed73dd3d302e036de59e703c11de2328\": container with ID starting with e165f99e22a2a8118b4a781d4041cde3ed73dd3d302e036de59e703c11de2328 not found: ID does not exist" Apr 24 22:42:40.187974 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:40.187947 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5"] Apr 24 22:42:40.193815 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:40.193793 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-133b5-7bdbb47f56-9svl5"] Apr 24 22:42:40.636470 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:42:40.636441 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30fc9e5b-d308-4796-8b2c-a69ec1e808ab" path="/var/lib/kubelet/pods/30fc9e5b-d308-4796-8b2c-a69ec1e808ab/volumes" Apr 24 22:44:40.569899 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:44:40.569867 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzsgr_239c26d8-bd64-4f99-9455-4fceceb609ee/ovn-acl-logging/0.log" Apr 24 22:44:40.573073 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:44:40.573039 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzsgr_239c26d8-bd64-4f99-9455-4fceceb609ee/ovn-acl-logging/0.log" Apr 24 22:49:40.590261 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:49:40.590234 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzsgr_239c26d8-bd64-4f99-9455-4fceceb609ee/ovn-acl-logging/0.log" Apr 24 22:49:40.594919 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:49:40.594897 2582 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzsgr_239c26d8-bd64-4f99-9455-4fceceb609ee/ovn-acl-logging/0.log" Apr 24 22:50:42.460971 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:42.460941 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-zvc7f_bcb4810e-da56-4f18-8ac9-65765230513d/global-pull-secret-syncer/0.log" Apr 24 22:50:42.531244 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:42.531205 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-kg2xf_3c9ae31a-f5e8-444b-8692-a6e8b24d04ad/konnectivity-agent/0.log" Apr 24 22:50:42.601196 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:42.601164 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-222.ec2.internal_b28c4947c642c3a6ca28e65c847e2583/haproxy/0.log" Apr 24 22:50:46.408389 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:46.408357 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-kq27h_4a22403d-f328-4bbf-a4a4-63ec0876fc5a/monitoring-plugin/0.log" Apr 24 22:50:46.437531 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:46.437498 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-88ffs_cbe451bd-f49e-4fe0-8e55-929f1582a45d/node-exporter/0.log" Apr 24 22:50:46.456481 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:46.456450 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-88ffs_cbe451bd-f49e-4fe0-8e55-929f1582a45d/kube-rbac-proxy/0.log" Apr 24 22:50:46.483973 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:46.483946 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-88ffs_cbe451bd-f49e-4fe0-8e55-929f1582a45d/init-textfile/0.log" Apr 24 22:50:47.021228 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:47.021193 2582 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-77497b7776-fcp8g_4f291ad5-a866-4e1b-a40c-5f48012a9705/thanos-query/0.log" Apr 24 22:50:47.052289 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:47.052255 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-77497b7776-fcp8g_4f291ad5-a866-4e1b-a40c-5f48012a9705/kube-rbac-proxy-web/0.log" Apr 24 22:50:47.079380 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:47.079350 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-77497b7776-fcp8g_4f291ad5-a866-4e1b-a40c-5f48012a9705/kube-rbac-proxy/0.log" Apr 24 22:50:47.104557 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:47.104516 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-77497b7776-fcp8g_4f291ad5-a866-4e1b-a40c-5f48012a9705/prom-label-proxy/0.log" Apr 24 22:50:47.129630 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:47.129605 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-77497b7776-fcp8g_4f291ad5-a866-4e1b-a40c-5f48012a9705/kube-rbac-proxy-rules/0.log" Apr 24 22:50:47.155431 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:47.155404 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-77497b7776-fcp8g_4f291ad5-a866-4e1b-a40c-5f48012a9705/kube-rbac-proxy-metrics/0.log" Apr 24 22:50:49.205961 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.205932 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-546sf_787faccd-4c44-470b-a814-aed056407d9a/download-server/0.log" Apr 24 22:50:49.342582 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.342550 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm"] Apr 24 22:50:49.342895 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.342883 
2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30fc9e5b-d308-4796-8b2c-a69ec1e808ab" containerName="model-chainer-raw-hpa-133b5" Apr 24 22:50:49.342943 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.342898 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="30fc9e5b-d308-4796-8b2c-a69ec1e808ab" containerName="model-chainer-raw-hpa-133b5" Apr 24 22:50:49.342976 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.342968 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="30fc9e5b-d308-4796-8b2c-a69ec1e808ab" containerName="model-chainer-raw-hpa-133b5" Apr 24 22:50:49.346196 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.346175 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm" Apr 24 22:50:49.347695 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.347675 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h9lwj\"/\"kube-root-ca.crt\"" Apr 24 22:50:49.348072 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.348037 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-h9lwj\"/\"default-dockercfg-nh675\"" Apr 24 22:50:49.348198 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.348038 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h9lwj\"/\"openshift-service-ca.crt\"" Apr 24 22:50:49.354180 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.354154 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm"] Apr 24 22:50:49.419760 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.419723 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a808542b-cfbd-41af-b5bb-dd814eba0a54-sys\") pod 
\"perf-node-gather-daemonset-sftsm\" (UID: \"a808542b-cfbd-41af-b5bb-dd814eba0a54\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm" Apr 24 22:50:49.419940 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.419773 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2szx\" (UniqueName: \"kubernetes.io/projected/a808542b-cfbd-41af-b5bb-dd814eba0a54-kube-api-access-g2szx\") pod \"perf-node-gather-daemonset-sftsm\" (UID: \"a808542b-cfbd-41af-b5bb-dd814eba0a54\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm" Apr 24 22:50:49.419940 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.419797 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a808542b-cfbd-41af-b5bb-dd814eba0a54-podres\") pod \"perf-node-gather-daemonset-sftsm\" (UID: \"a808542b-cfbd-41af-b5bb-dd814eba0a54\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm" Apr 24 22:50:49.419940 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.419840 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a808542b-cfbd-41af-b5bb-dd814eba0a54-lib-modules\") pod \"perf-node-gather-daemonset-sftsm\" (UID: \"a808542b-cfbd-41af-b5bb-dd814eba0a54\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm" Apr 24 22:50:49.419940 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.419909 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a808542b-cfbd-41af-b5bb-dd814eba0a54-proc\") pod \"perf-node-gather-daemonset-sftsm\" (UID: \"a808542b-cfbd-41af-b5bb-dd814eba0a54\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm" Apr 24 22:50:49.520651 ip-10-0-135-222 kubenswrapper[2582]: 
I0424 22:50:49.520555 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2szx\" (UniqueName: \"kubernetes.io/projected/a808542b-cfbd-41af-b5bb-dd814eba0a54-kube-api-access-g2szx\") pod \"perf-node-gather-daemonset-sftsm\" (UID: \"a808542b-cfbd-41af-b5bb-dd814eba0a54\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm" Apr 24 22:50:49.520651 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.520603 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a808542b-cfbd-41af-b5bb-dd814eba0a54-podres\") pod \"perf-node-gather-daemonset-sftsm\" (UID: \"a808542b-cfbd-41af-b5bb-dd814eba0a54\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm" Apr 24 22:50:49.520651 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.520639 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a808542b-cfbd-41af-b5bb-dd814eba0a54-lib-modules\") pod \"perf-node-gather-daemonset-sftsm\" (UID: \"a808542b-cfbd-41af-b5bb-dd814eba0a54\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm" Apr 24 22:50:49.520898 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.520670 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a808542b-cfbd-41af-b5bb-dd814eba0a54-proc\") pod \"perf-node-gather-daemonset-sftsm\" (UID: \"a808542b-cfbd-41af-b5bb-dd814eba0a54\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm" Apr 24 22:50:49.520898 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.520688 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a808542b-cfbd-41af-b5bb-dd814eba0a54-sys\") pod \"perf-node-gather-daemonset-sftsm\" (UID: \"a808542b-cfbd-41af-b5bb-dd814eba0a54\") " 
pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm" Apr 24 22:50:49.520898 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.520760 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a808542b-cfbd-41af-b5bb-dd814eba0a54-sys\") pod \"perf-node-gather-daemonset-sftsm\" (UID: \"a808542b-cfbd-41af-b5bb-dd814eba0a54\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm" Apr 24 22:50:49.520898 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.520793 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a808542b-cfbd-41af-b5bb-dd814eba0a54-proc\") pod \"perf-node-gather-daemonset-sftsm\" (UID: \"a808542b-cfbd-41af-b5bb-dd814eba0a54\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm" Apr 24 22:50:49.520898 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.520826 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a808542b-cfbd-41af-b5bb-dd814eba0a54-podres\") pod \"perf-node-gather-daemonset-sftsm\" (UID: \"a808542b-cfbd-41af-b5bb-dd814eba0a54\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm" Apr 24 22:50:49.520898 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.520826 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a808542b-cfbd-41af-b5bb-dd814eba0a54-lib-modules\") pod \"perf-node-gather-daemonset-sftsm\" (UID: \"a808542b-cfbd-41af-b5bb-dd814eba0a54\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm" Apr 24 22:50:49.527431 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.527407 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2szx\" (UniqueName: \"kubernetes.io/projected/a808542b-cfbd-41af-b5bb-dd814eba0a54-kube-api-access-g2szx\") pod 
\"perf-node-gather-daemonset-sftsm\" (UID: \"a808542b-cfbd-41af-b5bb-dd814eba0a54\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm"
Apr 24 22:50:49.656588 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.656551 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm"
Apr 24 22:50:49.779528 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.779450 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm"]
Apr 24 22:50:49.782348 ip-10-0-135-222 kubenswrapper[2582]: W0424 22:50:49.782318 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda808542b_cfbd_41af_b5bb_dd814eba0a54.slice/crio-8a3df548a47c9b27d20fcd3480c63f7c4823045236be1271810e4402eb659df2 WatchSource:0}: Error finding container 8a3df548a47c9b27d20fcd3480c63f7c4823045236be1271810e4402eb659df2: Status 404 returned error can't find the container with id 8a3df548a47c9b27d20fcd3480c63f7c4823045236be1271810e4402eb659df2
Apr 24 22:50:49.783980 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:49.783964 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:50:50.272533 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:50.272499 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vp89s_d25320e2-53da-44c0-bfb0-8ed3f795faf6/dns/0.log"
Apr 24 22:50:50.290695 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:50.290666 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vp89s_d25320e2-53da-44c0-bfb0-8ed3f795faf6/kube-rbac-proxy/0.log"
Apr 24 22:50:50.399261 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:50.399227 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wtlmx_722d1910-3c2f-4e70-af24-daf9f78fcf06/dns-node-resolver/0.log"
Apr 24 22:50:50.734843 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:50.734803 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm" event={"ID":"a808542b-cfbd-41af-b5bb-dd814eba0a54","Type":"ContainerStarted","Data":"7879619be261e9accede73fde6cce63f5fb559bf4513a8c50733ed5dd331a5bb"}
Apr 24 22:50:50.734843 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:50.734845 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm" event={"ID":"a808542b-cfbd-41af-b5bb-dd814eba0a54","Type":"ContainerStarted","Data":"8a3df548a47c9b27d20fcd3480c63f7c4823045236be1271810e4402eb659df2"}
Apr 24 22:50:50.735043 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:50.734939 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm"
Apr 24 22:50:50.750311 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:50.750267 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm" podStartSLOduration=1.750253212 podStartE2EDuration="1.750253212s" podCreationTimestamp="2026-04-24 22:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:50:50.748244396 +0000 UTC m=+1270.692035727" watchObservedRunningTime="2026-04-24 22:50:50.750253212 +0000 UTC m=+1270.694044543"
Apr 24 22:50:50.813480 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:50.813442 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5db684cb8c-4nqc9_4f2c2d47-f231-4b74-8af2-1181e5558d09/registry/0.log"
Apr 24 22:50:50.830986 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:50.830952 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-j4t6h_4f665f36-3e6e-4199-bbcd-df474abfeb86/node-ca/0.log"
Apr 24 22:50:51.929650 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:51.929618 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-qchcv_4fd3bad0-b406-44c2-b540-cbbaf1436e6d/serve-healthcheck-canary/0.log"
Apr 24 22:50:52.442231 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:52.442208 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-s5brh_4ffa9d0c-ccdc-4b8c-b83f-12076db312b8/kube-rbac-proxy/0.log"
Apr 24 22:50:52.463938 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:52.463909 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-s5brh_4ffa9d0c-ccdc-4b8c-b83f-12076db312b8/exporter/0.log"
Apr 24 22:50:52.484819 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:52.484782 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-s5brh_4ffa9d0c-ccdc-4b8c-b83f-12076db312b8/extractor/0.log"
Apr 24 22:50:54.424389 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:54.424352 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-549bc44c6d-l85mp_b2796c7a-f28e-4c0a-bdee-01d872204bc0/manager/0.log"
Apr 24 22:50:54.442311 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:54.442269 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-ttmtv_e39acfc7-5c5e-4bdc-9800-c2740dcbb193/manager/0.log"
Apr 24 22:50:54.540933 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:54.540895 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-c6rnt_dabe2de2-793c-48e3-90f8-b49ff5ec1fa9/manager/0.log"
Apr 24 22:50:54.599411 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:54.599383 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-m2mvp_4cff1d5d-90c6-4f9b-8de2-4273e5a66ab7/seaweedfs/0.log"
Apr 24 22:50:56.748613 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:56.748584 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-sftsm"
Apr 24 22:50:58.383846 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:58.383769 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-ff4l2_ce0f15c5-e087-48f7-8284-7dcf34afed79/migrator/0.log"
Apr 24 22:50:58.405302 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:58.405267 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-ff4l2_ce0f15c5-e087-48f7-8284-7dcf34afed79/graceful-termination/0.log"
Apr 24 22:50:59.677709 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:50:59.677670 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9wphl_d4b8fbfd-f18c-4b29-9c01-547311bd0ba6/kube-multus/0.log"
Apr 24 22:51:00.086124 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:51:00.086099 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z9q4w_44759cca-eeb7-4b34-af4d-65cef31d60a1/kube-multus-additional-cni-plugins/0.log"
Apr 24 22:51:00.108862 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:51:00.108834 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z9q4w_44759cca-eeb7-4b34-af4d-65cef31d60a1/egress-router-binary-copy/0.log"
Apr 24 22:51:00.128603 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:51:00.128582 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z9q4w_44759cca-eeb7-4b34-af4d-65cef31d60a1/cni-plugins/0.log"
Apr 24 22:51:00.148881 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:51:00.148857 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z9q4w_44759cca-eeb7-4b34-af4d-65cef31d60a1/bond-cni-plugin/0.log"
Apr 24 22:51:00.169931 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:51:00.169907 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z9q4w_44759cca-eeb7-4b34-af4d-65cef31d60a1/routeoverride-cni/0.log"
Apr 24 22:51:00.191157 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:51:00.191129 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z9q4w_44759cca-eeb7-4b34-af4d-65cef31d60a1/whereabouts-cni-bincopy/0.log"
Apr 24 22:51:00.211746 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:51:00.211707 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z9q4w_44759cca-eeb7-4b34-af4d-65cef31d60a1/whereabouts-cni/0.log"
Apr 24 22:51:00.341211 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:51:00.341178 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hgvbb_171d0bdf-1d87-4aee-9fad-9c28075596bd/network-metrics-daemon/0.log"
Apr 24 22:51:00.360459 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:51:00.360430 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hgvbb_171d0bdf-1d87-4aee-9fad-9c28075596bd/kube-rbac-proxy/0.log"
Apr 24 22:51:01.361560 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:51:01.361526 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzsgr_239c26d8-bd64-4f99-9455-4fceceb609ee/ovn-controller/0.log"
Apr 24 22:51:01.381628 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:51:01.381592 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzsgr_239c26d8-bd64-4f99-9455-4fceceb609ee/ovn-acl-logging/0.log"
Apr 24 22:51:01.387236 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:51:01.387215 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzsgr_239c26d8-bd64-4f99-9455-4fceceb609ee/ovn-acl-logging/1.log"
Apr 24 22:51:01.402556 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:51:01.402533 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzsgr_239c26d8-bd64-4f99-9455-4fceceb609ee/kube-rbac-proxy-node/0.log"
Apr 24 22:51:01.421438 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:51:01.421408 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzsgr_239c26d8-bd64-4f99-9455-4fceceb609ee/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 22:51:01.443206 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:51:01.443184 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzsgr_239c26d8-bd64-4f99-9455-4fceceb609ee/northd/0.log"
Apr 24 22:51:01.461852 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:51:01.461827 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzsgr_239c26d8-bd64-4f99-9455-4fceceb609ee/nbdb/0.log"
Apr 24 22:51:01.510804 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:51:01.510781 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzsgr_239c26d8-bd64-4f99-9455-4fceceb609ee/sbdb/0.log"
Apr 24 22:51:01.636629 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:51:01.636558 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dzsgr_239c26d8-bd64-4f99-9455-4fceceb609ee/ovnkube-controller/0.log"
Apr 24 22:51:02.856817 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:51:02.856783 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-7sw9z_b5d8eefa-153f-46d6-8848-82778399a098/network-check-target-container/0.log"
Apr 24 22:51:03.740314 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:51:03.740280 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-pqn8r_7cf45f35-3263-4dd4-83bf-caaac71acebd/iptables-alerter/0.log"
Apr 24 22:51:04.393555 ip-10-0-135-222 kubenswrapper[2582]: I0424 22:51:04.393527 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-7pchr_f4851636-e409-4338-9170-49d3547b7af4/tuned/0.log"