Apr 22 19:21:01.432351 ip-10-0-129-145 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 19:21:01.432362 ip-10-0-129-145 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 19:21:01.432369 ip-10-0-129-145 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 19:21:01.432635 ip-10-0-129-145 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 19:21:11.548204 ip-10-0-129-145 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 19:21:11.548223 ip-10-0-129-145 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 1312ef681b6e46aaadbf0e7b433039c2 --
Apr 22 19:23:27.767125 ip-10-0-129-145 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 19:23:28.198693 ip-10-0-129-145 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:28.198693 ip-10-0-129-145 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 19:23:28.198693 ip-10-0-129-145 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:28.198693 ip-10-0-129-145 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 19:23:28.198693 ip-10-0-129-145 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:28.201702 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.201610 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 19:23:28.204481 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204466 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:28.204481 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204481 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:28.204566 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204485 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:28.204566 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204488 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:28.204566 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204492 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:28.204566 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204495 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:28.204566 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204498 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:28.204566 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204501 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:28.204566 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204504 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:28.204566 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204507 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:28.204566 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204509 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:28.204566 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204512 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:28.204566 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204515 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:28.204566 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204517 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:28.204566 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204521 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:28.204566 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204524 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:28.204566 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204527 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:28.204566 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204530 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:28.204566 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204533 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:28.204566 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204535 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:28.204566 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204538 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:28.204566 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204541 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:28.205057 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204543 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:28.205057 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204546 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:28.205057 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204548 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:28.205057 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204551 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:28.205057 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204554 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:28.205057 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204557 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:28.205057 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204560 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:28.205057 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204562 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:28.205057 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204565 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:28.205057 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204568 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:28.205057 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204572 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:28.205057 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204575 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:28.205057 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204578 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:28.205057 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204581 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:28.205057 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204584 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:28.205057 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204586 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:28.205057 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204589 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:28.205057 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204591 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:28.205057 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204594 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:28.205500 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204598 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:28.205500 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204602 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:28.205500 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204605 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:28.205500 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204607 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:28.205500 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204610 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:28.205500 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204613 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:28.205500 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204622 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:28.205500 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204626 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:28.205500 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204628 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:28.205500 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204631 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:28.205500 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204634 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:28.205500 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204636 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:28.205500 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204639 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:28.205500 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204641 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:28.205500 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204645 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:28.205500 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204648 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:28.205500 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204651 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:28.205500 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204653 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:28.205500 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204656 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:28.206011 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204659 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:28.206011 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204661 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:28.206011 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204664 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:28.206011 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204669 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:28.206011 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204673 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:28.206011 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204676 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:28.206011 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204680 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:28.206011 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204683 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:28.206011 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204685 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:28.206011 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204688 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:28.206011 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204691 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:28.206011 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204693 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:28.206011 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204696 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:28.206011 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204698 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:28.206011 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204701 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:28.206011 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204704 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:28.206011 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204706 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:28.206011 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204709 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:28.206011 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204711 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:28.206461 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204714 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:28.206461 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204717 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:28.206461 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204719 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:28.206461 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204722 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:28.206461 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204725 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:28.206461 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204727 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:28.206461 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.204730 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:28.206461 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205125 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:28.206461 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205131 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:28.206461 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205136 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:28.206461 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205140 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:28.206461 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205145 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:28.206461 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205149 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:28.206461 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205152 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:28.206461 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205156 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:28.206461 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205159 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:28.206461 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205161 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:28.206461 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205164 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:28.206461 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205166 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:28.206935 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205169 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:28.206935 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205172 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:28.206935 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205174 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:28.206935 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205177 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:28.206935 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205179 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:28.206935 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205182 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:28.206935 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205185 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:28.206935 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205187 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:28.206935 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205190 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:28.206935 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205192 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:28.206935 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205195 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:28.206935 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205198 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:28.206935 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205201 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:28.206935 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205204 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:28.206935 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205207 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:28.206935 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205210 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:28.206935 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205212 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:28.206935 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205215 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:28.206935 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205218 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:28.206935 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205220 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:28.207435 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205223 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:28.207435 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205227 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:28.207435 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205230 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:28.207435 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205232 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:28.207435 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205235 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:28.207435 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205237 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:28.207435 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205240 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:28.207435 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205242 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:28.207435 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205245 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:28.207435 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205248 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:28.207435 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205250 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:28.207435 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205253 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:28.207435 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205255 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:28.207435 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205258 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:28.207435 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205260 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:28.207435 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205263 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:28.207435 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205265 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:28.207435 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205268 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:28.207435 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205270 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:28.207435 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205273 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:28.207926 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205275 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:28.207926 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205278 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:28.207926 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205280 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:28.207926 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205283 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:28.207926 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205286 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:28.207926 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205289 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:28.207926 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205292 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:28.207926 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205295 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:28.207926 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205297 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:28.207926 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205299 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:28.207926 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205302 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:28.207926 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205304 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:28.207926 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205307 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:28.207926 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205310 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:28.207926 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205312 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:28.207926 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205315 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:28.207926 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205318 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:28.207926 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205320 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:28.207926 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205323 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:28.207926 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205325 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:28.208427 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205328 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:28.208427 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205330 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:28.208427 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205333 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:28.208427 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205336 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:28.208427 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205338 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:28.208427 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205341 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:28.208427 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205344 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:28.208427 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205346 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:28.208427 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205349 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:28.208427 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205351 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:28.208427 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205353 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:28.208427 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205356 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:28.208427 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205359 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:28.208427 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.205361 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:28.208427 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206518 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 19:23:28.208427 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206529 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 19:23:28.208427 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206535 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 19:23:28.208427 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206541 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 19:23:28.208427 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206546 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 19:23:28.208427 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206550 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 19:23:28.208427 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206555 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 19:23:28.208966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206560 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 19:23:28.208966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206563 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 19:23:28.208966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206566 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 19:23:28.208966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206570 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 19:23:28.208966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206573 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 19:23:28.208966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206577 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 19:23:28.208966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206580 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 22 19:23:28.208966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206583 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 19:23:28.208966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206587 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 22 19:23:28.208966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206590 2574 flags.go:64] FLAG: --cloud-config=""
Apr 22 19:23:28.208966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206594 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 19:23:28.208966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206597 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 19:23:28.208966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206601 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 22 19:23:28.208966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206604 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 19:23:28.208966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206608 2574 flags.go:64] FLAG: --config-dir=""
Apr 22 19:23:28.208966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206610 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 19:23:28.208966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206614 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 19:23:28.208966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206618 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 19:23:28.208966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206621 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 19:23:28.208966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206625 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 19:23:28.208966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206628 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 19:23:28.208966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206631 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 19:23:28.208966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206634 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 19:23:28.208966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206637 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 19:23:28.209542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206640 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 19:23:28.209542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206643 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 19:23:28.209542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206648 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 19:23:28.209542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206651 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 19:23:28.209542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206654 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 19:23:28.209542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206658 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 19:23:28.209542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206661 2574 flags.go:64] FLAG: --enable-server="true"
Apr 22 19:23:28.209542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206664 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 19:23:28.209542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206669 2574 flags.go:64] FLAG: --event-burst="100"
Apr 22 19:23:28.209542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206672 2574 flags.go:64] FLAG: --event-qps="50"
Apr 22 19:23:28.209542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206676 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 19:23:28.209542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206680 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 19:23:28.209542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206683 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 22 19:23:28.209542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206687 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 19:23:28.209542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206689 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 19:23:28.209542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206693 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 19:23:28.209542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206696 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 22 19:23:28.209542 ip-10-0-129-145
kubenswrapper[2574]: I0422 19:23:28.206699 2574 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 19:23:28.209542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206702 2574 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 19:23:28.209542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206705 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 19:23:28.209542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206708 2574 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 19:23:28.209542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206712 2574 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 19:23:28.209542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206715 2574 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 19:23:28.209542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206717 2574 flags.go:64] FLAG: --feature-gates="" Apr 22 19:23:28.209542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206721 2574 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 19:23:28.210167 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206724 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 19:23:28.210167 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206728 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 19:23:28.210167 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206731 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 19:23:28.210167 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206734 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 22 19:23:28.210167 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206738 2574 flags.go:64] FLAG: --help="false" Apr 22 19:23:28.210167 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206741 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-129-145.ec2.internal" Apr 22 19:23:28.210167 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206744 2574 flags.go:64] FLAG: 
--housekeeping-interval="10s" Apr 22 19:23:28.210167 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206748 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 19:23:28.210167 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206751 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 19:23:28.210167 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206754 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 19:23:28.210167 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206758 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 19:23:28.210167 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206761 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 19:23:28.210167 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206764 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 19:23:28.210167 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206767 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 19:23:28.210167 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206770 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 19:23:28.210167 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206773 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 19:23:28.210167 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206777 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 19:23:28.210167 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206780 2574 flags.go:64] FLAG: --kube-reserved="" Apr 22 19:23:28.210167 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206783 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 19:23:28.210167 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206786 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 19:23:28.210167 ip-10-0-129-145 
kubenswrapper[2574]: I0422 19:23:28.206789 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 19:23:28.210167 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206792 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 19:23:28.210167 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206795 2574 flags.go:64] FLAG: --lock-file="" Apr 22 19:23:28.210167 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206808 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 19:23:28.210730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206812 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 19:23:28.210730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206815 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 19:23:28.210730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206820 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 19:23:28.210730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206823 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 19:23:28.210730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206826 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 19:23:28.210730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206829 2574 flags.go:64] FLAG: --logging-format="text" Apr 22 19:23:28.210730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206832 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 19:23:28.210730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206836 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 19:23:28.210730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206839 2574 flags.go:64] FLAG: --manifest-url="" Apr 22 19:23:28.210730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206842 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 22 19:23:28.210730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206846 2574 flags.go:64] FLAG: 
--max-housekeeping-interval="15s" Apr 22 19:23:28.210730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206850 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 19:23:28.210730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206854 2574 flags.go:64] FLAG: --max-pods="110" Apr 22 19:23:28.210730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206857 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 19:23:28.210730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206861 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 19:23:28.210730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206863 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 19:23:28.210730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206867 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 19:23:28.210730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206870 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 19:23:28.210730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206873 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 19:23:28.210730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206877 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 19:23:28.210730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206884 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 19:23:28.210730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206888 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 19:23:28.210730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206891 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 19:23:28.210730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206894 2574 flags.go:64] FLAG: --pod-cidr="" Apr 22 19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206897 2574 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206906 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206910 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206913 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 22 19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206916 2574 flags.go:64] FLAG: --port="10250" Apr 22 19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206919 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206922 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c1e1e2c57336bf0d" Apr 22 19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206925 2574 flags.go:64] FLAG: --qos-reserved="" Apr 22 19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206929 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 22 19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206932 2574 flags.go:64] FLAG: --register-node="true" Apr 22 19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206935 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 22 19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206938 2574 flags.go:64] FLAG: --register-with-taints="" Apr 22 19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206942 2574 flags.go:64] FLAG: --registry-burst="10" Apr 22 19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206945 2574 flags.go:64] FLAG: --registry-qps="5" Apr 22 19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206948 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 22 
19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206951 2574 flags.go:64] FLAG: --reserved-memory="" Apr 22 19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206955 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206958 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206961 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206967 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206970 2574 flags.go:64] FLAG: --runonce="false" Apr 22 19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206973 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206976 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206979 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 22 19:23:28.211346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206982 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206985 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206988 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206991 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.206994 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: I0422 
19:23:28.206997 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.207000 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.207003 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.207006 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.207009 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.207013 2574 flags.go:64] FLAG: --system-cgroups="" Apr 22 19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.207016 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.207022 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.207025 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 22 19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.207028 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.207033 2574 flags.go:64] FLAG: --tls-min-version="" Apr 22 19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.207036 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.207039 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.207042 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.207045 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 
19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.207048 2574 flags.go:64] FLAG: --v="2" Apr 22 19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.207053 2574 flags.go:64] FLAG: --version="false" Apr 22 19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.207057 2574 flags.go:64] FLAG: --vmodule="" Apr 22 19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.207062 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.207065 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 19:23:28.211949 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207162 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:23:28.212567 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207166 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:23:28.212567 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207171 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:23:28.212567 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207174 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:23:28.212567 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207177 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:23:28.212567 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207180 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:23:28.212567 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207182 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:23:28.212567 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207185 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:23:28.212567 ip-10-0-129-145 kubenswrapper[2574]: 
W0422 19:23:28.207189 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 19:23:28.212567 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207193 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:23:28.212567 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207196 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:23:28.212567 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207199 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:23:28.212567 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207201 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:23:28.212567 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207204 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:23:28.212567 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207207 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:23:28.212567 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207210 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:23:28.212567 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207213 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:23:28.212567 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207215 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:23:28.212567 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207218 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:23:28.212567 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207221 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:23:28.213115 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207224 2574 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:23:28.213115 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207226 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:23:28.213115 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207229 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:23:28.213115 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207231 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:23:28.213115 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207234 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:23:28.213115 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207237 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:23:28.213115 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207239 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:23:28.213115 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207242 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:23:28.213115 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207245 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:23:28.213115 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207247 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:23:28.213115 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207250 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:23:28.213115 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207252 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:23:28.213115 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207255 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:23:28.213115 ip-10-0-129-145 
kubenswrapper[2574]: W0422 19:23:28.207257 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:23:28.213115 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207261 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:23:28.213115 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207264 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:23:28.213115 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207266 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:23:28.213115 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207269 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:23:28.213115 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207271 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:23:28.213115 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207274 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:23:28.213622 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207276 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:23:28.213622 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207279 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:23:28.213622 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207281 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:23:28.213622 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207284 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:23:28.213622 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207287 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:23:28.213622 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207289 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 
19:23:28.213622 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207291 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:23:28.213622 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207294 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:23:28.213622 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207299 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:23:28.213622 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207302 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:23:28.213622 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207305 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:23:28.213622 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207308 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:23:28.213622 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207318 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:23:28.213622 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207321 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:23:28.213622 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207324 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:23:28.213622 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207327 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:23:28.213622 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207330 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:23:28.213622 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207332 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:23:28.213622 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207335 2574 feature_gate.go:328] 
unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:23:28.213622 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207337 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:23:28.214212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207340 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:23:28.214212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207342 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:23:28.214212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207345 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:23:28.214212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207348 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:23:28.214212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207351 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:23:28.214212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207354 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:23:28.214212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207358 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:23:28.214212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207360 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:23:28.214212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207363 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:23:28.214212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207365 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:23:28.214212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207368 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:23:28.214212 
ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207371 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:28.214212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207374 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:28.214212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207377 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:28.214212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207380 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:28.214212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207382 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:28.214212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207385 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:28.214212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207388 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:28.214212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207390 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:28.214212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207392 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:28.214698 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207396 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:28.214698 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207399 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:28.214698 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207402 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:28.214698 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207404 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:28.214698 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207407 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:28.214698 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.207410 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:28.214698 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.207923 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:23:28.218047 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.218022 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 19:23:28.218047 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.218046 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 19:23:28.218212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218117 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:28.218212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218125 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:28.218212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218129 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:28.218212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218133 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:28.218212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218137 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:28.218212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218144 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:28.218212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218151 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:28.218212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218156 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:28.218212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218160 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:28.218212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218165 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:28.218212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218169 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:28.218212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218173 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:28.218212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218176 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:28.218212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218181 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:28.218212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218185 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:28.218212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218189 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:28.218212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218194 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:28.218212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218198 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:28.218212 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218202 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:28.219064 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218206 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:28.219064 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218211 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:28.219064 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218215 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:28.219064 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218220 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:28.219064 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218224 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:28.219064 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218229 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:28.219064 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218233 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:28.219064 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218237 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:28.219064 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218241 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:28.219064 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218245 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:28.219064 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218250 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:28.219064 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218254 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:28.219064 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218259 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:28.219064 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218263 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:28.219064 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218267 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:28.219064 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218271 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:28.219064 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218275 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:28.219064 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218279 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:28.219064 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218283 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:28.219064 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218287 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:28.219664 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218291 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:28.219664 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218295 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:28.219664 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218299 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:28.219664 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218304 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:28.219664 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218308 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:28.219664 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218312 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:28.219664 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218317 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:28.219664 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218321 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:28.219664 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218325 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:28.219664 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218329 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:28.219664 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218333 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:28.219664 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218337 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:28.219664 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218341 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:28.219664 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218347 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:28.219664 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218351 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:28.219664 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218355 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:28.219664 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218359 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:28.219664 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218363 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:28.219664 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218368 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:28.220269 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218374 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:28.220269 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218380 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:28.220269 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218385 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:28.220269 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218389 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:28.220269 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218394 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:28.220269 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218398 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:28.220269 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218402 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:28.220269 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218407 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:28.220269 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218411 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:28.220269 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218415 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:28.220269 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218419 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:28.220269 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218423 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:28.220269 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218427 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:28.220269 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218431 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:28.220269 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218435 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:28.220269 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218440 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:28.220269 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218443 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:28.220269 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218448 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:28.220269 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218453 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:28.220269 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218457 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:28.221039 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218461 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:28.221039 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218465 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:28.221039 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218469 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:28.221039 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218474 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:28.221039 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218478 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:28.221039 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218482 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:28.221039 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218487 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:28.221039 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218491 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:28.221039 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.218500 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:23:28.221039 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218687 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:28.221039 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218696 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:28.221039 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218701 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:28.221039 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218705 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:28.221039 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218710 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:28.221039 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218714 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:28.221423 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218719 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:28.221423 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218724 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:28.221423 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218728 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:28.221423 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218732 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:28.221423 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218737 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:28.221423 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218741 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:28.221423 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218745 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:28.221423 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218749 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:28.221423 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218753 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:28.221423 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218758 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:28.221423 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218763 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:28.221423 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218767 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:28.221423 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218771 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:28.221423 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218775 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:28.221423 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218779 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:28.221423 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218783 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:28.221423 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218788 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:28.221423 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218793 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:28.221423 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218813 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:28.221423 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218818 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:28.222018 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218823 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:28.222018 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218830 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:28.222018 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218837 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:28.222018 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218842 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:28.222018 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218846 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:28.222018 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218851 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:28.222018 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218855 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:28.222018 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218859 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:28.222018 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218863 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:28.222018 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218868 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:28.222018 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218872 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:28.222018 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218876 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:28.222018 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218880 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:28.222018 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218885 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:28.222018 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218889 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:28.222018 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218893 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:28.222018 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218897 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:28.222018 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218902 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:28.222018 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218906 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:28.222018 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218910 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:28.222510 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218914 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:28.222510 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218918 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:28.222510 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218923 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:28.222510 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218927 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:28.222510 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218931 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:28.222510 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218935 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:28.222510 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218939 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:28.222510 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218943 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:28.222510 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218947 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:28.222510 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218953 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:28.222510 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218957 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:28.222510 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218961 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:28.222510 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218968 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:28.222510 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218974 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:28.222510 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218979 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:28.222510 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218984 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:28.222510 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218988 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:28.222510 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218993 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:28.222510 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.218998 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:28.223015 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.219003 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:28.223015 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.219008 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:28.223015 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.219012 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:28.223015 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.219017 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:28.223015 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.219021 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:28.223015 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.219026 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:28.223015 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.219030 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:28.223015 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.219034 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:28.223015 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.219039 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:28.223015 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.219043 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:28.223015 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.219047 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:28.223015 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.219051 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:28.223015 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.219055 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:28.223015 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.219059 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:28.223015 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.219063 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:28.223015 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.219067 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:28.223015 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.219071 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:28.223015 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.219075 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:28.223015 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.219079 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:28.223015 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.219083 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:28.223845 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:28.219088 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:28.223845 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.219096 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:23:28.223845 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.219759 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 19:23:28.224624 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.224609 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 19:23:28.225433 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.225421 2574 server.go:1019] "Starting client certificate rotation"
Apr 22 19:23:28.225544 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.225515 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:23:28.225583 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.225552 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:23:28.251579 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.251552 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:23:28.256022 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.256001 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:23:28.272587 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.272565 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 22 19:23:28.277941 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.277917 2574 log.go:25] "Validated CRI v1 image API"
Apr 22 19:23:28.279222 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.279200 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 19:23:28.281405 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.281386 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:23:28.283578 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.283309 2574 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 93d7358c-c5f9-4d86-a0e9-c361c802fcc2:/dev/nvme0n1p4 cf7637df-b669-47a9-96aa-4e9c8954303d:/dev/nvme0n1p3]
Apr 22 19:23:28.283578 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.283387 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 19:23:28.290220 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.289979 2574 manager.go:217] Machine: {Timestamp:2026-04-22 19:23:28.288091518 +0000 UTC m=+0.407470818 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100362 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec20c3f0ef407b7c04ee1c7bd7569297 SystemUUID:ec20c3f0-ef40-7b7c-04ee-1c7bd7569297 BootID:1312ef68-1b6e-46aa-adbf-0e7b433039c2 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:7e:d0:50:6b:37 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:7e:d0:50:6b:37 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:22:09:05:7c:a1:d4 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 19:23:28.290220 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.290208 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 19:23:28.290396 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.290319 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 19:23:28.291462 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.291437 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 19:23:28.291664 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.291464 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-145.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 19:23:28.291749 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.291678 2574 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 19:23:28.291749 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.291692 2574 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 19:23:28.291749 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.291710
2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:23:28.294599 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.294585 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:23:28.295900 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.295886 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:23:28.296027 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.296016 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 19:23:28.298405 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.298393 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 22 19:23:28.298475 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.298412 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 19:23:28.298475 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.298427 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 19:23:28.298475 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.298441 2574 kubelet.go:397] "Adding apiserver pod source" Apr 22 19:23:28.298475 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.298454 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 19:23:28.299547 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.299533 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:23:28.299623 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.299556 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:23:28.302277 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.302254 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 19:23:28.303247 ip-10-0-129-145 
kubenswrapper[2574]: I0422 19:23:28.303230 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-p2pjm" Apr 22 19:23:28.303636 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.303623 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 19:23:28.305474 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.305461 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 19:23:28.305538 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.305479 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 19:23:28.305538 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.305486 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 19:23:28.305538 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.305491 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 19:23:28.305538 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.305498 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 19:23:28.305538 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.305504 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 19:23:28.305538 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.305510 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 19:23:28.305538 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.305516 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 19:23:28.305538 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.305524 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 19:23:28.305538 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.305530 2574 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 19:23:28.305538 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.305543 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 19:23:28.305790 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.305552 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 19:23:28.307158 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.307148 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 19:23:28.307158 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.307158 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 19:23:28.310597 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.310568 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 19:23:28.310690 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.310609 2574 server.go:1295] "Started kubelet" Apr 22 19:23:28.310746 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.310686 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 19:23:28.310790 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.310729 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 19:23:28.310915 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.310824 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 19:23:28.311607 ip-10-0-129-145 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 19:23:28.311816 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.311777 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-145.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 22 19:23:28.311915 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:28.311795 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 22 19:23:28.311915 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.311830 2574 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 19:23:28.312008 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:28.311957 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-145.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 22 19:23:28.312851 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.312836 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 19:23:28.316201 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.316184 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-p2pjm"
Apr 22 19:23:28.316474 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.316458 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 19:23:28.316975 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.316959 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 19:23:28.318848 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.318701 2574 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 19:23:28.318929 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.318856 2574 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 19:23:28.318929 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:28.318836 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-145.ec2.internal\" not found"
Apr 22 19:23:28.319042 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.318960 2574 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 19:23:28.319042 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.318972 2574 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 19:23:28.319127 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.319080 2574 factory.go:55] Registering systemd factory
Apr 22 19:23:28.319127 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.319095 2574 factory.go:223] Registration of the systemd container factory successfully
Apr 22 19:23:28.319598 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.319581 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 19:23:28.319751 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.319730 2574 factory.go:153] Registering CRI-O factory
Apr 22 19:23:28.319751 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.319753 2574 factory.go:223] Registration of the crio container factory successfully
Apr 22 19:23:28.319916 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.319840 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 19:23:28.319916 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.319867 2574 factory.go:103] Registering Raw factory
Apr 22 19:23:28.319916 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.319889 2574 manager.go:1196] Started watching for new ooms in manager
Apr 22 19:23:28.320371 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.320355 2574 manager.go:319] Starting recovery of all containers
Apr 22 19:23:28.320501 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:28.319601 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-145.ec2.internal.18a8c43a72c8f0f2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-145.ec2.internal,UID:ip-10-0-129-145.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-145.ec2.internal,},FirstTimestamp:2026-04-22 19:23:28.310579442 +0000 UTC m=+0.429958741,LastTimestamp:2026-04-22 19:23:28.310579442 +0000 UTC m=+0.429958741,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-145.ec2.internal,}"
Apr 22 19:23:28.322475 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:28.322454 2574 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 19:23:28.327589 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.327570 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:28.330096 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:28.330071 2574 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-145.ec2.internal\" not found" node="ip-10-0-129-145.ec2.internal"
Apr 22 19:23:28.331893 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.331874 2574 manager.go:324] Recovery completed
Apr 22 19:23:28.333078 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:28.333056 2574 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 22 19:23:28.336224 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.336213 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:28.338485 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.338467 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-145.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:28.338571 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.338497 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-145.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:28.338571 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.338512 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-145.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:28.339040 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.339027 2574 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 19:23:28.339040 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.339038 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 19:23:28.339140 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.339055 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 19:23:28.342307 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.342295 2574 policy_none.go:49] "None policy: Start"
Apr 22 19:23:28.342341 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.342312 2574 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 19:23:28.342341 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.342323 2574 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 19:23:28.374429 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.374411 2574 manager.go:341] "Starting Device Plugin manager"
Apr 22 19:23:28.400833 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:28.374450 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 19:23:28.400833 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.374461 2574 server.go:85] "Starting device plugin registration server"
Apr 22 19:23:28.400833 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.374681 2574 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 19:23:28.400833 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.374691 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 19:23:28.400833 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.375240 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 19:23:28.400833 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.375334 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 19:23:28.400833 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.375343 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 19:23:28.400833 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:28.375597 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 19:23:28.400833 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:28.375636 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-145.ec2.internal\" not found"
Apr 22 19:23:28.422911 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.422879 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 19:23:28.424139 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.424114 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 19:23:28.424224 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.424147 2574 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 19:23:28.424224 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.424170 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 19:23:28.424224 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.424180 2574 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 19:23:28.424224 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:28.424220 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 19:23:28.426781 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.426762 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:28.475366 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.475291 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:28.476432 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.476406 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-145.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:28.476432 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.476436 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-145.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:28.476553 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.476448 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-145.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:28.476553 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.476471 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-145.ec2.internal"
Apr 22 19:23:28.484254 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.484238 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-145.ec2.internal"
Apr 22 19:23:28.484309 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:28.484262 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-145.ec2.internal\": node \"ip-10-0-129-145.ec2.internal\" not found"
Apr 22 19:23:28.495598 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:28.495579 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-145.ec2.internal\" not found"
Apr 22 19:23:28.524411 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.524375 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-145.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-145.ec2.internal"]
Apr 22 19:23:28.524482 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.524452 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:28.525301 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.525284 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-145.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:28.525378 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.525313 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-145.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:28.525378 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.525323 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-145.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:28.527706 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.527693 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:28.527862 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.527848 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-145.ec2.internal"
Apr 22 19:23:28.527903 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.527877 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:28.528344 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.528330 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-145.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:28.528414 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.528354 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-145.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:28.528414 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.528364 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-145.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:28.528414 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.528337 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-145.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:28.528509 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.528424 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-145.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:28.528509 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.528437 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-145.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:28.530504 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.530488 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-145.ec2.internal"
Apr 22 19:23:28.530597 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.530511 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:28.531134 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.531118 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-145.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:28.531224 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.531143 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-145.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:28.531224 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.531154 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-145.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:28.556464 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:28.556438 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-145.ec2.internal\" not found" node="ip-10-0-129-145.ec2.internal"
Apr 22 19:23:28.560853 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:28.560827 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-145.ec2.internal\" not found" node="ip-10-0-129-145.ec2.internal"
Apr 22 19:23:28.596370 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:28.596347 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-145.ec2.internal\" not found"
Apr 22 19:23:28.697251 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:28.697221 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-145.ec2.internal\" not found"
Apr 22 19:23:28.719506 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.719481 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c78ed32340ce7d1e488ffab8bbfb0412-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-145.ec2.internal\" (UID: \"c78ed32340ce7d1e488ffab8bbfb0412\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-145.ec2.internal"
Apr 22 19:23:28.719597 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.719514 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c78ed32340ce7d1e488ffab8bbfb0412-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-145.ec2.internal\" (UID: \"c78ed32340ce7d1e488ffab8bbfb0412\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-145.ec2.internal"
Apr 22 19:23:28.719597 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.719532 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b71282cca731aa5ddeb9357344f86ebf-config\") pod \"kube-apiserver-proxy-ip-10-0-129-145.ec2.internal\" (UID: \"b71282cca731aa5ddeb9357344f86ebf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-145.ec2.internal"
Apr 22 19:23:28.797826 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:28.797731 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-145.ec2.internal\" not found"
Apr 22 19:23:28.820124 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.820101 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c78ed32340ce7d1e488ffab8bbfb0412-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-145.ec2.internal\" (UID: \"c78ed32340ce7d1e488ffab8bbfb0412\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-145.ec2.internal"
Apr 22 19:23:28.820180 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.820132 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c78ed32340ce7d1e488ffab8bbfb0412-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-145.ec2.internal\" (UID: \"c78ed32340ce7d1e488ffab8bbfb0412\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-145.ec2.internal"
Apr 22 19:23:28.820180 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.820149 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b71282cca731aa5ddeb9357344f86ebf-config\") pod \"kube-apiserver-proxy-ip-10-0-129-145.ec2.internal\" (UID: \"b71282cca731aa5ddeb9357344f86ebf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-145.ec2.internal"
Apr 22 19:23:28.820269 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.820216 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b71282cca731aa5ddeb9357344f86ebf-config\") pod \"kube-apiserver-proxy-ip-10-0-129-145.ec2.internal\" (UID: \"b71282cca731aa5ddeb9357344f86ebf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-145.ec2.internal"
Apr 22 19:23:28.820269 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.820221 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c78ed32340ce7d1e488ffab8bbfb0412-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-145.ec2.internal\" (UID: \"c78ed32340ce7d1e488ffab8bbfb0412\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-145.ec2.internal"
Apr 22 19:23:28.820269 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.820223 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c78ed32340ce7d1e488ffab8bbfb0412-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-145.ec2.internal\" (UID: \"c78ed32340ce7d1e488ffab8bbfb0412\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-145.ec2.internal"
Apr 22 19:23:28.858191 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.858171 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-145.ec2.internal"
Apr 22 19:23:28.863738 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:28.863721 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-145.ec2.internal"
Apr 22 19:23:28.898156 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:28.898123 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-145.ec2.internal\" not found"
Apr 22 19:23:28.998582 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:28.998547 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-145.ec2.internal\" not found"
Apr 22 19:23:29.098996 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:29.098921 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-145.ec2.internal\" not found"
Apr 22 19:23:29.157275 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:29.157249 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:29.199871 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:29.199848 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-145.ec2.internal\" not found"
Apr 22 19:23:29.225293 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:29.225276 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 19:23:29.225428 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:29.225410 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:23:29.225464 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:29.225430 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:23:29.225496 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:29.225457 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:23:29.300743 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:29.300722 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-145.ec2.internal\" not found"
Apr 22 19:23:29.317028 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:29.317005 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 19:23:29.321926 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:29.321893 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 19:18:28 +0000 UTC" deadline="2027-09-27 22:40:11.861770269 +0000 UTC"
Apr 22 19:23:29.321926 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:29.321922 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12555h16m42.539850646s"
Apr 22 19:23:29.332645 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:29.332618 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:23:29.355735 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:29.355666 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-sq7p8"
Apr 22 19:23:29.362415 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:29.362389 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-sq7p8"
Apr 22 19:23:29.401097 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:29.401072 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-145.ec2.internal\" not found"
Apr 22 19:23:29.428833 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:29.428787 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb71282cca731aa5ddeb9357344f86ebf.slice/crio-7ccec5c6b6eb0b4cfd8dd829fb6840bcb335b9b261d576a99a76ff44d8b71b96 WatchSource:0}: Error finding container 7ccec5c6b6eb0b4cfd8dd829fb6840bcb335b9b261d576a99a76ff44d8b71b96: Status 404 returned error can't find the container with id 7ccec5c6b6eb0b4cfd8dd829fb6840bcb335b9b261d576a99a76ff44d8b71b96
Apr 22 19:23:29.429348 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:29.429330 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc78ed32340ce7d1e488ffab8bbfb0412.slice/crio-3b8b1af425586c89e2f350b89dcbcb19d36c81157c96eaef18b4b34c0ba12ca6 WatchSource:0}: Error finding container 3b8b1af425586c89e2f350b89dcbcb19d36c81157c96eaef18b4b34c0ba12ca6: Status 404 returned error can't find the container with id 
3b8b1af425586c89e2f350b89dcbcb19d36c81157c96eaef18b4b34c0ba12ca6 Apr 22 19:23:29.434529 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:29.434516 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:23:29.435534 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:29.435518 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:29.519305 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:29.519261 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-145.ec2.internal" Apr 22 19:23:29.532594 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:29.532570 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:23:29.533366 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:29.533354 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-145.ec2.internal" Apr 22 19:23:29.542899 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:29.542882 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:23:30.067155 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.067121 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:30.299432 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.299169 2574 apiserver.go:52] "Watching apiserver" Apr 22 19:23:30.308618 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.308582 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 19:23:30.310627 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.310597 
2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-7kfks","openshift-image-registry/node-ca-jkcf2","openshift-network-operator/iptables-alerter-85tpl","openshift-ovn-kubernetes/ovnkube-node-rsnsl","kube-system/kube-apiserver-proxy-ip-10-0-129-145.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg","openshift-dns/node-resolver-cpzbb","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-145.ec2.internal","openshift-multus/multus-8z9zt","openshift-multus/multus-additional-cni-plugins-dcctc","openshift-multus/network-metrics-daemon-nndbq","openshift-network-diagnostics/network-check-target-bllx4","kube-system/konnectivity-agent-42mdb"] Apr 22 19:23:30.313365 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.313341 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cpzbb" Apr 22 19:23:30.315702 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.315676 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jkcf2" Apr 22 19:23:30.316576 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.316533 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qgw28\"" Apr 22 19:23:30.316680 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.316575 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 19:23:30.316680 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.316572 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 19:23:30.319413 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.319091 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-85tpl" Apr 22 19:23:30.321034 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.319883 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-d7znh\"" Apr 22 19:23:30.321034 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.319943 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 19:23:30.321034 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.320107 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 19:23:30.321034 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.320272 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 19:23:30.321991 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.321515 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 19:23:30.321991 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.321558 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gsd26\"" Apr 22 19:23:30.321991 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.321943 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:23:30.322174 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.322021 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 19:23:30.324218 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.324193 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg" Apr 22 19:23:30.326677 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.326516 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-xgnxp\"" Apr 22 19:23:30.326677 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.326558 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 19:23:30.326677 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.326582 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 19:23:30.326901 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.326749 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 19:23:30.327557 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.327518 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.328024 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.327999 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/91147599-bdf0-49f5-98ed-a3567eaf56db-serviceca\") pod \"node-ca-jkcf2\" (UID: \"91147599-bdf0-49f5-98ed-a3567eaf56db\") " pod="openshift-image-registry/node-ca-jkcf2" Apr 22 19:23:30.328115 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328040 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7dbaab45-2adf-4e5c-b969-f8d3eb83ea37-iptables-alerter-script\") pod \"iptables-alerter-85tpl\" (UID: \"7dbaab45-2adf-4e5c-b969-f8d3eb83ea37\") " pod="openshift-network-operator/iptables-alerter-85tpl" Apr 22 19:23:30.328115 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328064 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-run-systemd\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.328115 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328073 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.328115 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328086 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-host-cni-netd\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.328313 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328113 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-ovn-node-metrics-cert\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.328313 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328137 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b1c0b054-80a0-4cc5-b053-a4d99268aa8f-tmp-dir\") pod \"node-resolver-cpzbb\" (UID: \"b1c0b054-80a0-4cc5-b053-a4d99268aa8f\") " pod="openshift-dns/node-resolver-cpzbb" Apr 22 19:23:30.328313 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328160 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-host-kubelet\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.328313 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328182 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-var-lib-openvswitch\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.328313 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328207 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-etc-openvswitch\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.328313 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328230 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whxxk\" (UniqueName: \"kubernetes.io/projected/7dbaab45-2adf-4e5c-b969-f8d3eb83ea37-kube-api-access-whxxk\") pod \"iptables-alerter-85tpl\" (UID: \"7dbaab45-2adf-4e5c-b969-f8d3eb83ea37\") " pod="openshift-network-operator/iptables-alerter-85tpl" Apr 22 19:23:30.328313 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328254 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.328313 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328278 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-env-overrides\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.328313 
ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328299 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b1c0b054-80a0-4cc5-b053-a4d99268aa8f-hosts-file\") pod \"node-resolver-cpzbb\" (UID: \"b1c0b054-80a0-4cc5-b053-a4d99268aa8f\") " pod="openshift-dns/node-resolver-cpzbb" Apr 22 19:23:30.328313 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328317 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-host-slash\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.328753 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328331 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-run-openvswitch\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.328753 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328345 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-node-log\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.328753 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328359 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-log-socket\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.328753 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328375 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk2m8\" (UniqueName: \"kubernetes.io/projected/b1c0b054-80a0-4cc5-b053-a4d99268aa8f-kube-api-access-wk2m8\") pod \"node-resolver-cpzbb\" (UID: \"b1c0b054-80a0-4cc5-b053-a4d99268aa8f\") " pod="openshift-dns/node-resolver-cpzbb" Apr 22 19:23:30.328753 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328388 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/91147599-bdf0-49f5-98ed-a3567eaf56db-host\") pod \"node-ca-jkcf2\" (UID: \"91147599-bdf0-49f5-98ed-a3567eaf56db\") " pod="openshift-image-registry/node-ca-jkcf2" Apr 22 19:23:30.328753 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328404 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7dbaab45-2adf-4e5c-b969-f8d3eb83ea37-host-slash\") pod \"iptables-alerter-85tpl\" (UID: \"7dbaab45-2adf-4e5c-b969-f8d3eb83ea37\") " pod="openshift-network-operator/iptables-alerter-85tpl" Apr 22 19:23:30.328753 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328418 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-host-run-netns\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.328753 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328433 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-run-ovn\") pod 
\"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.328753 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328459 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-host-run-ovn-kubernetes\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.328753 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328473 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-ovnkube-config\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.328753 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328488 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns8mp\" (UniqueName: \"kubernetes.io/projected/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-kube-api-access-ns8mp\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.328753 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328506 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-host-cni-bin\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.328753 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328520 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sxwq\" (UniqueName: \"kubernetes.io/projected/91147599-bdf0-49f5-98ed-a3567eaf56db-kube-api-access-7sxwq\") pod \"node-ca-jkcf2\" (UID: \"91147599-bdf0-49f5-98ed-a3567eaf56db\") " pod="openshift-image-registry/node-ca-jkcf2" Apr 22 19:23:30.328753 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328534 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-systemd-units\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.328753 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.328550 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-ovnkube-script-lib\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.330479 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.329962 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:23:30.330479 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.330036 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-b5jlt\"" Apr 22 19:23:30.330479 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.329968 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 19:23:30.330479 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.330359 2574 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 19:23:30.330479 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.330397 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-gfcjx\"" Apr 22 19:23:30.331279 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.331264 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 19:23:30.331645 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.331625 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 19:23:30.331726 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.331648 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 19:23:30.331770 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.331756 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 19:23:30.331770 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.331762 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 19:23:30.333469 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.333447 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dcctc" Apr 22 19:23:30.335759 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.335739 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 19:23:30.335867 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.335772 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 19:23:30.335931 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.335876 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 19:23:30.335999 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.335983 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.336110 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.336095 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 19:23:30.336342 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.336325 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 19:23:30.336749 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.336730 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hkfdv\"" Apr 22 19:23:30.338469 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.338303 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 19:23:30.338469 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.338381 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bllx4" Apr 22 19:23:30.338636 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:30.338483 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bllx4" podUID="cbfd8869-819e-45c7-9536-08c72a48f2c3" Apr 22 19:23:30.338849 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.338835 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-26kft\"" Apr 22 19:23:30.339177 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.339158 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nndbq" Apr 22 19:23:30.339276 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:30.339229 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nndbq" podUID="1c461896-346c-4de1-9362-b9f83bd3486d" Apr 22 19:23:30.340700 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.340681 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-42mdb"
Apr 22 19:23:30.343318 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.343197 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-rx9ps\""
Apr 22 19:23:30.343318 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.343220 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 19:23:30.343318 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.343214 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 19:23:30.363699 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.363666 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:18:29 +0000 UTC" deadline="2028-01-20 05:43:34.560930758 +0000 UTC"
Apr 22 19:23:30.363699 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.363698 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15298h20m4.197236085s"
Apr 22 19:23:30.420980 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.420950 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 19:23:30.428495 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.428443 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-145.ec2.internal" event={"ID":"b71282cca731aa5ddeb9357344f86ebf","Type":"ContainerStarted","Data":"7ccec5c6b6eb0b4cfd8dd829fb6840bcb335b9b261d576a99a76ff44d8b71b96"}
Apr 22 19:23:30.428786 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.428751 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-run-openvswitch\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.428911 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.428790 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-node-log\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.428911 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.428834 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wk2m8\" (UniqueName: \"kubernetes.io/projected/b1c0b054-80a0-4cc5-b053-a4d99268aa8f-kube-api-access-wk2m8\") pod \"node-resolver-cpzbb\" (UID: \"b1c0b054-80a0-4cc5-b053-a4d99268aa8f\") " pod="openshift-dns/node-resolver-cpzbb"
Apr 22 19:23:30.428911 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.428861 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7dbaab45-2adf-4e5c-b969-f8d3eb83ea37-host-slash\") pod \"iptables-alerter-85tpl\" (UID: \"7dbaab45-2adf-4e5c-b969-f8d3eb83ea37\") " pod="openshift-network-operator/iptables-alerter-85tpl"
Apr 22 19:23:30.428911 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.428883 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-run-openvswitch\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.429102 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.428936 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7dbaab45-2adf-4e5c-b969-f8d3eb83ea37-host-slash\") pod \"iptables-alerter-85tpl\" (UID: \"7dbaab45-2adf-4e5c-b969-f8d3eb83ea37\") " pod="openshift-network-operator/iptables-alerter-85tpl"
Apr 22 19:23:30.429102 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.428941 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-host-run-netns\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.429102 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.428885 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-host-run-netns\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.429102 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.428986 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-run-ovn\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.429102 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.428991 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-node-log\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.429102 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429016 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-host-run-netns\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt"
Apr 22 19:23:30.429102 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429031 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-run-ovn\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.429102 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429046 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b948da6e-8c3e-4892-92f1-4f59d7c5c885-system-cni-dir\") pod \"multus-additional-cni-plugins-dcctc\" (UID: \"b948da6e-8c3e-4892-92f1-4f59d7c5c885\") " pod="openshift-multus/multus-additional-cni-plugins-dcctc"
Apr 22 19:23:30.429458 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429115 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/159fb4c8-bbd5-4247-849c-f5639e9543f7-socket-dir\") pod \"aws-ebs-csi-driver-node-lm4wg\" (UID: \"159fb4c8-bbd5-4247-849c-f5639e9543f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg"
Apr 22 19:23:30.429458 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429150 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/159fb4c8-bbd5-4247-849c-f5639e9543f7-etc-selinux\") pod \"aws-ebs-csi-driver-node-lm4wg\" (UID: \"159fb4c8-bbd5-4247-849c-f5639e9543f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg"
Apr 22 19:23:30.429458 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429175 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b948da6e-8c3e-4892-92f1-4f59d7c5c885-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dcctc\" (UID: \"b948da6e-8c3e-4892-92f1-4f59d7c5c885\") " pod="openshift-multus/multus-additional-cni-plugins-dcctc"
Apr 22 19:23:30.429458 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429249 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-host-cni-bin\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.429458 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429285 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/32dd8967-d559-441e-95e3-6faf8bc49253-multus-daemon-config\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt"
Apr 22 19:23:30.429458 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429310 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/159fb4c8-bbd5-4247-849c-f5639e9543f7-sys-fs\") pod \"aws-ebs-csi-driver-node-lm4wg\" (UID: \"159fb4c8-bbd5-4247-849c-f5639e9543f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg"
Apr 22 19:23:30.429458 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429317 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-host-cni-bin\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.429458 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429335 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-etc-sysctl-conf\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks"
Apr 22 19:23:30.429458 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429360 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-lib-modules\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks"
Apr 22 19:23:30.429458 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429396 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7sxwq\" (UniqueName: \"kubernetes.io/projected/91147599-bdf0-49f5-98ed-a3567eaf56db-kube-api-access-7sxwq\") pod \"node-ca-jkcf2\" (UID: \"91147599-bdf0-49f5-98ed-a3567eaf56db\") " pod="openshift-image-registry/node-ca-jkcf2"
Apr 22 19:23:30.429918 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429461 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-host-var-lib-cni-multus\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt"
Apr 22 19:23:30.429918 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429498 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-etc-kubernetes\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt"
Apr 22 19:23:30.429918 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429523 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-etc-kubernetes\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks"
Apr 22 19:23:30.429918 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429546 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-sys\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks"
Apr 22 19:23:30.429918 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429576 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-host\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks"
Apr 22 19:23:30.429918 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429603 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/91147599-bdf0-49f5-98ed-a3567eaf56db-serviceca\") pod \"node-ca-jkcf2\" (UID: \"91147599-bdf0-49f5-98ed-a3567eaf56db\") " pod="openshift-image-registry/node-ca-jkcf2"
Apr 22 19:23:30.429918 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429630 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7dbaab45-2adf-4e5c-b969-f8d3eb83ea37-iptables-alerter-script\") pod \"iptables-alerter-85tpl\" (UID: \"7dbaab45-2adf-4e5c-b969-f8d3eb83ea37\") " pod="openshift-network-operator/iptables-alerter-85tpl"
Apr 22 19:23:30.429918 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429656 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-multus-cni-dir\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt"
Apr 22 19:23:30.429918 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429678 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-os-release\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt"
Apr 22 19:23:30.429918 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429754 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-hostroot\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt"
Apr 22 19:23:30.429918 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429785 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-host-run-multus-certs\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt"
Apr 22 19:23:30.429918 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429829 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b948da6e-8c3e-4892-92f1-4f59d7c5c885-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dcctc\" (UID: \"b948da6e-8c3e-4892-92f1-4f59d7c5c885\") " pod="openshift-multus/multus-additional-cni-plugins-dcctc"
Apr 22 19:23:30.429918 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429856 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fqcc\" (UniqueName: \"kubernetes.io/projected/cbfd8869-819e-45c7-9536-08c72a48f2c3-kube-api-access-7fqcc\") pod \"network-check-target-bllx4\" (UID: \"cbfd8869-819e-45c7-9536-08c72a48f2c3\") " pod="openshift-network-diagnostics/network-check-target-bllx4"
Apr 22 19:23:30.429918 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429885 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-ovn-node-metrics-cert\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.429918 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429913 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/76e397fd-d6b3-4cfb-aa90-fb57dfa68ba4-konnectivity-ca\") pod \"konnectivity-agent-42mdb\" (UID: \"76e397fd-d6b3-4cfb-aa90-fb57dfa68ba4\") " pod="kube-system/konnectivity-agent-42mdb"
Apr 22 19:23:30.430587 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.429953 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-var-lib-kubelet\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks"
Apr 22 19:23:30.430587 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430015 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-145.ec2.internal" event={"ID":"c78ed32340ce7d1e488ffab8bbfb0412","Type":"ContainerStarted","Data":"3b8b1af425586c89e2f350b89dcbcb19d36c81157c96eaef18b4b34c0ba12ca6"}
Apr 22 19:23:30.430587 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430072 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fdac791-5aa3-4153-bb07-34cec3dbf296-tmp\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks"
Apr 22 19:23:30.430587 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430108 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b1c0b054-80a0-4cc5-b053-a4d99268aa8f-tmp-dir\") pod \"node-resolver-cpzbb\" (UID: \"b1c0b054-80a0-4cc5-b053-a4d99268aa8f\") " pod="openshift-dns/node-resolver-cpzbb"
Apr 22 19:23:30.430587 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430124 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-var-lib-openvswitch\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.430587 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430176 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/91147599-bdf0-49f5-98ed-a3567eaf56db-serviceca\") pod \"node-ca-jkcf2\" (UID: \"91147599-bdf0-49f5-98ed-a3567eaf56db\") " pod="openshift-image-registry/node-ca-jkcf2"
Apr 22 19:23:30.430587 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430268 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-var-lib-openvswitch\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.430587 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430264 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 19:23:30.430587 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430305 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-etc-openvswitch\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.430587 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430302 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7dbaab45-2adf-4e5c-b969-f8d3eb83ea37-iptables-alerter-script\") pod \"iptables-alerter-85tpl\" (UID: \"7dbaab45-2adf-4e5c-b969-f8d3eb83ea37\") " pod="openshift-network-operator/iptables-alerter-85tpl"
Apr 22 19:23:30.430587 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430347 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-host-var-lib-kubelet\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt"
Apr 22 19:23:30.430587 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430385 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b948da6e-8c3e-4892-92f1-4f59d7c5c885-cni-binary-copy\") pod \"multus-additional-cni-plugins-dcctc\" (UID: \"b948da6e-8c3e-4892-92f1-4f59d7c5c885\") " pod="openshift-multus/multus-additional-cni-plugins-dcctc"
Apr 22 19:23:30.430587 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430390 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-etc-openvswitch\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.430587 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430402 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b1c0b054-80a0-4cc5-b053-a4d99268aa8f-tmp-dir\") pod \"node-resolver-cpzbb\" (UID: \"b1c0b054-80a0-4cc5-b053-a4d99268aa8f\") " pod="openshift-dns/node-resolver-cpzbb"
Apr 22 19:23:30.430587 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430424 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/76e397fd-d6b3-4cfb-aa90-fb57dfa68ba4-agent-certs\") pod \"konnectivity-agent-42mdb\" (UID: \"76e397fd-d6b3-4cfb-aa90-fb57dfa68ba4\") " pod="kube-system/konnectivity-agent-42mdb"
Apr 22 19:23:30.430587 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430465 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/159fb4c8-bbd5-4247-849c-f5639e9543f7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lm4wg\" (UID: \"159fb4c8-bbd5-4247-849c-f5639e9543f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg"
Apr 22 19:23:30.430587 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430494 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.431389 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430528 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.431389 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430561 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-multus-conf-dir\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt"
Apr 22 19:23:30.431389 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430582 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b948da6e-8c3e-4892-92f1-4f59d7c5c885-os-release\") pod \"multus-additional-cni-plugins-dcctc\" (UID: \"b948da6e-8c3e-4892-92f1-4f59d7c5c885\") " pod="openshift-multus/multus-additional-cni-plugins-dcctc"
Apr 22 19:23:30.431389 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430612 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jnww\" (UniqueName: \"kubernetes.io/projected/1c461896-346c-4de1-9362-b9f83bd3486d-kube-api-access-2jnww\") pod \"network-metrics-daemon-nndbq\" (UID: \"1c461896-346c-4de1-9362-b9f83bd3486d\") " pod="openshift-multus/network-metrics-daemon-nndbq"
Apr 22 19:23:30.431389 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430633 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-etc-systemd\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks"
Apr 22 19:23:30.431389 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430659 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b1c0b054-80a0-4cc5-b053-a4d99268aa8f-hosts-file\") pod \"node-resolver-cpzbb\" (UID: \"b1c0b054-80a0-4cc5-b053-a4d99268aa8f\") " pod="openshift-dns/node-resolver-cpzbb"
Apr 22 19:23:30.431389 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430707 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-host-slash\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.431389 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430732 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-log-socket\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.431389 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430758 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-host-var-lib-cni-bin\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt"
Apr 22 19:23:30.431389 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430785 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgwv5\" (UniqueName: \"kubernetes.io/projected/32dd8967-d559-441e-95e3-6faf8bc49253-kube-api-access-fgwv5\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt"
Apr 22 19:23:30.431389 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430792 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-host-slash\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.431389 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430822 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b948da6e-8c3e-4892-92f1-4f59d7c5c885-cnibin\") pod \"multus-additional-cni-plugins-dcctc\" (UID: \"b948da6e-8c3e-4892-92f1-4f59d7c5c885\") " pod="openshift-multus/multus-additional-cni-plugins-dcctc"
Apr 22 19:23:30.431389 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430781 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b1c0b054-80a0-4cc5-b053-a4d99268aa8f-hosts-file\") pod \"node-resolver-cpzbb\" (UID: \"b1c0b054-80a0-4cc5-b053-a4d99268aa8f\") " pod="openshift-dns/node-resolver-cpzbb"
Apr 22 19:23:30.431389 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430855 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-log-socket\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.431389 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430892 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/91147599-bdf0-49f5-98ed-a3567eaf56db-host\") pod \"node-ca-jkcf2\" (UID: \"91147599-bdf0-49f5-98ed-a3567eaf56db\") " pod="openshift-image-registry/node-ca-jkcf2"
Apr 22 19:23:30.431389 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430933 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-host-run-ovn-kubernetes\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.431389 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.430979 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-ovnkube-config\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.432174 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431003 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/91147599-bdf0-49f5-98ed-a3567eaf56db-host\") pod \"node-ca-jkcf2\" (UID: \"91147599-bdf0-49f5-98ed-a3567eaf56db\") " pod="openshift-image-registry/node-ca-jkcf2"
Apr 22 19:23:30.432174 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431021 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ns8mp\" (UniqueName: \"kubernetes.io/projected/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-kube-api-access-ns8mp\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.432174 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431048 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-cnibin\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt"
Apr 22 19:23:30.432174 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431051 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-host-run-ovn-kubernetes\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.432174 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431074 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2rxg\" (UniqueName: \"kubernetes.io/projected/159fb4c8-bbd5-4247-849c-f5639e9543f7-kube-api-access-n2rxg\") pod \"aws-ebs-csi-driver-node-lm4wg\" (UID: \"159fb4c8-bbd5-4247-849c-f5639e9543f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg"
Apr 22 19:23:30.432174 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431118 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c461896-346c-4de1-9362-b9f83bd3486d-metrics-certs\") pod \"network-metrics-daemon-nndbq\" (UID: \"1c461896-346c-4de1-9362-b9f83bd3486d\") " pod="openshift-multus/network-metrics-daemon-nndbq"
Apr 22 19:23:30.432174 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431142 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/159fb4c8-bbd5-4247-849c-f5639e9543f7-registration-dir\") pod \"aws-ebs-csi-driver-node-lm4wg\" (UID: \"159fb4c8-bbd5-4247-849c-f5639e9543f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg"
Apr 22 19:23:30.432174 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431186 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-systemd-units\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.432174 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431248 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-ovnkube-script-lib\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.432174 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431260 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-systemd-units\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.432174 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431299 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-etc-modprobe-d\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks"
Apr 22 19:23:30.432174 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431330 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-etc-sysconfig\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks"
Apr 22 19:23:30.432174 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431354 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-run\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks"
Apr 22 19:23:30.432174 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431436 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-run-systemd\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.432174 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431465 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-host-cni-netd\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.432174 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431494 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-system-cni-dir\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt"
Apr 22 19:23:30.432174 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431534 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/32dd8967-d559-441e-95e3-6faf8bc49253-cni-binary-copy\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt"
Apr 22 19:23:30.432875 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431550 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-ovnkube-config\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.432875 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431560 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-host-run-k8s-cni-cncf-io\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt"
Apr 22 19:23:30.432875 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431594 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b948da6e-8c3e-4892-92f1-4f59d7c5c885-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dcctc\" (UID: \"b948da6e-8c3e-4892-92f1-4f59d7c5c885\") " pod="openshift-multus/multus-additional-cni-plugins-dcctc"
Apr 22 19:23:30.432875 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431621 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpdxw\" (UniqueName: \"kubernetes.io/projected/b948da6e-8c3e-4892-92f1-4f59d7c5c885-kube-api-access-bpdxw\") pod \"multus-additional-cni-plugins-dcctc\" (UID: \"b948da6e-8c3e-4892-92f1-4f59d7c5c885\") " pod="openshift-multus/multus-additional-cni-plugins-dcctc"
Apr 22 19:23:30.432875 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431645 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0fdac791-5aa3-4153-bb07-34cec3dbf296-etc-tuned\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks"
Apr 22 19:23:30.432875 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431660 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-run-systemd\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.432875 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431683 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74sqt\" (UniqueName: \"kubernetes.io/projected/0fdac791-5aa3-4153-bb07-34cec3dbf296-kube-api-access-74sqt\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks"
Apr 22 19:23:30.432875 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431710 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-host-kubelet\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:30.432875 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431712 2574 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-host-cni-netd\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.432875 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431741 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-multus-socket-dir-parent\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.432875 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431768 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-whxxk\" (UniqueName: \"kubernetes.io/projected/7dbaab45-2adf-4e5c-b969-f8d3eb83ea37-kube-api-access-whxxk\") pod \"iptables-alerter-85tpl\" (UID: \"7dbaab45-2adf-4e5c-b969-f8d3eb83ea37\") " pod="openshift-network-operator/iptables-alerter-85tpl" Apr 22 19:23:30.432875 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431793 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-env-overrides\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.432875 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431871 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/159fb4c8-bbd5-4247-849c-f5639e9543f7-device-dir\") pod \"aws-ebs-csi-driver-node-lm4wg\" (UID: \"159fb4c8-bbd5-4247-849c-f5639e9543f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg" Apr 22 19:23:30.432875 
ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431896 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-etc-sysctl-d\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.432875 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.431955 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-ovnkube-script-lib\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.432875 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.432026 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-host-kubelet\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.432875 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.432342 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-env-overrides\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.434115 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.434091 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-ovn-node-metrics-cert\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 
19:23:30.438244 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.438215 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sxwq\" (UniqueName: \"kubernetes.io/projected/91147599-bdf0-49f5-98ed-a3567eaf56db-kube-api-access-7sxwq\") pod \"node-ca-jkcf2\" (UID: \"91147599-bdf0-49f5-98ed-a3567eaf56db\") " pod="openshift-image-registry/node-ca-jkcf2" Apr 22 19:23:30.439283 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.439260 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk2m8\" (UniqueName: \"kubernetes.io/projected/b1c0b054-80a0-4cc5-b053-a4d99268aa8f-kube-api-access-wk2m8\") pod \"node-resolver-cpzbb\" (UID: \"b1c0b054-80a0-4cc5-b053-a4d99268aa8f\") " pod="openshift-dns/node-resolver-cpzbb" Apr 22 19:23:30.439493 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.439458 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns8mp\" (UniqueName: \"kubernetes.io/projected/5879d8e5-623a-4ec2-9a22-b0b6c0c5917b-kube-api-access-ns8mp\") pod \"ovnkube-node-rsnsl\" (UID: \"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.443718 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.443688 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-whxxk\" (UniqueName: \"kubernetes.io/projected/7dbaab45-2adf-4e5c-b969-f8d3eb83ea37-kube-api-access-whxxk\") pod \"iptables-alerter-85tpl\" (UID: \"7dbaab45-2adf-4e5c-b969-f8d3eb83ea37\") " pod="openshift-network-operator/iptables-alerter-85tpl" Apr 22 19:23:30.532630 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.532584 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-multus-socket-dir-parent\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " 
pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.532819 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.532639 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/159fb4c8-bbd5-4247-849c-f5639e9543f7-device-dir\") pod \"aws-ebs-csi-driver-node-lm4wg\" (UID: \"159fb4c8-bbd5-4247-849c-f5639e9543f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg" Apr 22 19:23:30.532819 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.532663 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-etc-sysctl-d\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.532819 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.532690 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-host-run-netns\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.532819 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.532708 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-multus-socket-dir-parent\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.532819 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.532717 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b948da6e-8c3e-4892-92f1-4f59d7c5c885-system-cni-dir\") pod \"multus-additional-cni-plugins-dcctc\" (UID: 
\"b948da6e-8c3e-4892-92f1-4f59d7c5c885\") " pod="openshift-multus/multus-additional-cni-plugins-dcctc" Apr 22 19:23:30.532819 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.532746 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/159fb4c8-bbd5-4247-849c-f5639e9543f7-socket-dir\") pod \"aws-ebs-csi-driver-node-lm4wg\" (UID: \"159fb4c8-bbd5-4247-849c-f5639e9543f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg" Apr 22 19:23:30.532819 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.532770 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-host-run-netns\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.532819 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.532768 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/159fb4c8-bbd5-4247-849c-f5639e9543f7-device-dir\") pod \"aws-ebs-csi-driver-node-lm4wg\" (UID: \"159fb4c8-bbd5-4247-849c-f5639e9543f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg" Apr 22 19:23:30.532819 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.532816 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/159fb4c8-bbd5-4247-849c-f5639e9543f7-etc-selinux\") pod \"aws-ebs-csi-driver-node-lm4wg\" (UID: \"159fb4c8-bbd5-4247-849c-f5639e9543f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg" Apr 22 19:23:30.533269 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.532827 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-etc-sysctl-d\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.533269 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.532865 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b948da6e-8c3e-4892-92f1-4f59d7c5c885-system-cni-dir\") pod \"multus-additional-cni-plugins-dcctc\" (UID: \"b948da6e-8c3e-4892-92f1-4f59d7c5c885\") " pod="openshift-multus/multus-additional-cni-plugins-dcctc" Apr 22 19:23:30.533269 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.532950 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/159fb4c8-bbd5-4247-849c-f5639e9543f7-socket-dir\") pod \"aws-ebs-csi-driver-node-lm4wg\" (UID: \"159fb4c8-bbd5-4247-849c-f5639e9543f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg" Apr 22 19:23:30.533269 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.532942 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b948da6e-8c3e-4892-92f1-4f59d7c5c885-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dcctc\" (UID: \"b948da6e-8c3e-4892-92f1-4f59d7c5c885\") " pod="openshift-multus/multus-additional-cni-plugins-dcctc" Apr 22 19:23:30.533269 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.532987 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/32dd8967-d559-441e-95e3-6faf8bc49253-multus-daemon-config\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.533269 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533012 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/159fb4c8-bbd5-4247-849c-f5639e9543f7-sys-fs\") pod \"aws-ebs-csi-driver-node-lm4wg\" (UID: \"159fb4c8-bbd5-4247-849c-f5639e9543f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg" Apr 22 19:23:30.533269 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533019 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/159fb4c8-bbd5-4247-849c-f5639e9543f7-etc-selinux\") pod \"aws-ebs-csi-driver-node-lm4wg\" (UID: \"159fb4c8-bbd5-4247-849c-f5639e9543f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg" Apr 22 19:23:30.533269 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533036 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-etc-sysctl-conf\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.533269 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533089 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-lib-modules\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.533269 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533126 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-host-var-lib-cni-multus\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.533269 ip-10-0-129-145 
kubenswrapper[2574]: I0422 19:23:30.533149 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-etc-kubernetes\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.533269 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533172 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-etc-kubernetes\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.533269 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533174 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-etc-sysctl-conf\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.533269 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533206 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-sys\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.533269 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533221 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-etc-kubernetes\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.533269 ip-10-0-129-145 kubenswrapper[2574]: I0422 
19:23:30.533231 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-host\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.533269 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533278 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-host\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.534149 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533315 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-lib-modules\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.534149 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533339 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-sys\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.534149 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533348 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-multus-cni-dir\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.534149 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533377 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-host-var-lib-cni-multus\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.534149 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533392 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-os-release\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.534149 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533415 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-etc-kubernetes\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.534149 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533421 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b948da6e-8c3e-4892-92f1-4f59d7c5c885-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dcctc\" (UID: \"b948da6e-8c3e-4892-92f1-4f59d7c5c885\") " pod="openshift-multus/multus-additional-cni-plugins-dcctc" Apr 22 19:23:30.534149 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533417 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-hostroot\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.534149 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533450 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-hostroot\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.534149 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533471 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-host-run-multus-certs\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.534149 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533480 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-multus-cni-dir\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.534149 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533502 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b948da6e-8c3e-4892-92f1-4f59d7c5c885-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dcctc\" (UID: \"b948da6e-8c3e-4892-92f1-4f59d7c5c885\") " pod="openshift-multus/multus-additional-cni-plugins-dcctc" Apr 22 19:23:30.534149 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533506 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-os-release\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.534149 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533508 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-host-run-multus-certs\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.534149 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533529 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fqcc\" (UniqueName: \"kubernetes.io/projected/cbfd8869-819e-45c7-9536-08c72a48f2c3-kube-api-access-7fqcc\") pod \"network-check-target-bllx4\" (UID: \"cbfd8869-819e-45c7-9536-08c72a48f2c3\") " pod="openshift-network-diagnostics/network-check-target-bllx4" Apr 22 19:23:30.534149 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533567 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/76e397fd-d6b3-4cfb-aa90-fb57dfa68ba4-konnectivity-ca\") pod \"konnectivity-agent-42mdb\" (UID: \"76e397fd-d6b3-4cfb-aa90-fb57dfa68ba4\") " pod="kube-system/konnectivity-agent-42mdb" Apr 22 19:23:30.534149 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533587 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/32dd8967-d559-441e-95e3-6faf8bc49253-multus-daemon-config\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.534149 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533644 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/159fb4c8-bbd5-4247-849c-f5639e9543f7-sys-fs\") pod \"aws-ebs-csi-driver-node-lm4wg\" (UID: \"159fb4c8-bbd5-4247-849c-f5639e9543f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg" Apr 22 19:23:30.535073 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533643 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b948da6e-8c3e-4892-92f1-4f59d7c5c885-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dcctc\" (UID: \"b948da6e-8c3e-4892-92f1-4f59d7c5c885\") " pod="openshift-multus/multus-additional-cni-plugins-dcctc" Apr 22 19:23:30.535073 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533748 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-var-lib-kubelet\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.535073 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533778 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fdac791-5aa3-4153-bb07-34cec3dbf296-tmp\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.535073 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533821 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-host-var-lib-kubelet\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.535073 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533851 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b948da6e-8c3e-4892-92f1-4f59d7c5c885-cni-binary-copy\") pod \"multus-additional-cni-plugins-dcctc\" (UID: \"b948da6e-8c3e-4892-92f1-4f59d7c5c885\") " pod="openshift-multus/multus-additional-cni-plugins-dcctc" Apr 22 19:23:30.535073 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533878 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/76e397fd-d6b3-4cfb-aa90-fb57dfa68ba4-agent-certs\") pod \"konnectivity-agent-42mdb\" (UID: \"76e397fd-d6b3-4cfb-aa90-fb57dfa68ba4\") " pod="kube-system/konnectivity-agent-42mdb" Apr 22 19:23:30.535073 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533906 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/159fb4c8-bbd5-4247-849c-f5639e9543f7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lm4wg\" (UID: \"159fb4c8-bbd5-4247-849c-f5639e9543f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg" Apr 22 19:23:30.535073 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533914 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-host-var-lib-kubelet\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.535073 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533934 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-multus-conf-dir\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.535073 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533959 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b948da6e-8c3e-4892-92f1-4f59d7c5c885-os-release\") pod \"multus-additional-cni-plugins-dcctc\" (UID: \"b948da6e-8c3e-4892-92f1-4f59d7c5c885\") " pod="openshift-multus/multus-additional-cni-plugins-dcctc" Apr 22 19:23:30.535073 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533985 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jnww\" (UniqueName: \"kubernetes.io/projected/1c461896-346c-4de1-9362-b9f83bd3486d-kube-api-access-2jnww\") pod \"network-metrics-daemon-nndbq\" (UID: \"1c461896-346c-4de1-9362-b9f83bd3486d\") " pod="openshift-multus/network-metrics-daemon-nndbq" Apr 22 19:23:30.535073 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.534009 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-etc-systemd\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.535073 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.534024 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/76e397fd-d6b3-4cfb-aa90-fb57dfa68ba4-konnectivity-ca\") pod \"konnectivity-agent-42mdb\" (UID: \"76e397fd-d6b3-4cfb-aa90-fb57dfa68ba4\") " pod="kube-system/konnectivity-agent-42mdb" Apr 22 19:23:30.535073 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.534037 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-host-var-lib-cni-bin\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.535073 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.534061 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgwv5\" (UniqueName: \"kubernetes.io/projected/32dd8967-d559-441e-95e3-6faf8bc49253-kube-api-access-fgwv5\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.535073 ip-10-0-129-145 kubenswrapper[2574]: I0422 
19:23:30.534086 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/159fb4c8-bbd5-4247-849c-f5639e9543f7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lm4wg\" (UID: \"159fb4c8-bbd5-4247-849c-f5639e9543f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg" Apr 22 19:23:30.535073 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.534100 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b948da6e-8c3e-4892-92f1-4f59d7c5c885-cnibin\") pod \"multus-additional-cni-plugins-dcctc\" (UID: \"b948da6e-8c3e-4892-92f1-4f59d7c5c885\") " pod="openshift-multus/multus-additional-cni-plugins-dcctc" Apr 22 19:23:30.535948 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.534129 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-cnibin\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.535948 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.534160 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2rxg\" (UniqueName: \"kubernetes.io/projected/159fb4c8-bbd5-4247-849c-f5639e9543f7-kube-api-access-n2rxg\") pod \"aws-ebs-csi-driver-node-lm4wg\" (UID: \"159fb4c8-bbd5-4247-849c-f5639e9543f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg" Apr 22 19:23:30.535948 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.534188 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c461896-346c-4de1-9362-b9f83bd3486d-metrics-certs\") pod \"network-metrics-daemon-nndbq\" (UID: \"1c461896-346c-4de1-9362-b9f83bd3486d\") " pod="openshift-multus/network-metrics-daemon-nndbq" Apr 22 
19:23:30.535948 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.534212 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/159fb4c8-bbd5-4247-849c-f5639e9543f7-registration-dir\") pod \"aws-ebs-csi-driver-node-lm4wg\" (UID: \"159fb4c8-bbd5-4247-849c-f5639e9543f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg" Apr 22 19:23:30.535948 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.534241 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-host-var-lib-cni-bin\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.535948 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.534254 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-etc-modprobe-d\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.535948 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.534280 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-etc-sysconfig\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.535948 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.534294 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-multus-conf-dir\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 
22 19:23:30.535948 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.534304 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-run\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.535948 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.534333 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-system-cni-dir\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.535948 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.534356 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/32dd8967-d559-441e-95e3-6faf8bc49253-cni-binary-copy\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.535948 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.534358 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b948da6e-8c3e-4892-92f1-4f59d7c5c885-os-release\") pod \"multus-additional-cni-plugins-dcctc\" (UID: \"b948da6e-8c3e-4892-92f1-4f59d7c5c885\") " pod="openshift-multus/multus-additional-cni-plugins-dcctc" Apr 22 19:23:30.535948 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.534392 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-host-run-k8s-cni-cncf-io\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.535948 ip-10-0-129-145 
kubenswrapper[2574]: I0422 19:23:30.534424 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b948da6e-8c3e-4892-92f1-4f59d7c5c885-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dcctc\" (UID: \"b948da6e-8c3e-4892-92f1-4f59d7c5c885\") " pod="openshift-multus/multus-additional-cni-plugins-dcctc" Apr 22 19:23:30.535948 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.534446 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpdxw\" (UniqueName: \"kubernetes.io/projected/b948da6e-8c3e-4892-92f1-4f59d7c5c885-kube-api-access-bpdxw\") pod \"multus-additional-cni-plugins-dcctc\" (UID: \"b948da6e-8c3e-4892-92f1-4f59d7c5c885\") " pod="openshift-multus/multus-additional-cni-plugins-dcctc" Apr 22 19:23:30.535948 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.534470 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0fdac791-5aa3-4153-bb07-34cec3dbf296-etc-tuned\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.535948 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.534492 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74sqt\" (UniqueName: \"kubernetes.io/projected/0fdac791-5aa3-4153-bb07-34cec3dbf296-kube-api-access-74sqt\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.536831 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.534779 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b948da6e-8c3e-4892-92f1-4f59d7c5c885-cni-binary-copy\") pod \"multus-additional-cni-plugins-dcctc\" (UID: 
\"b948da6e-8c3e-4892-92f1-4f59d7c5c885\") " pod="openshift-multus/multus-additional-cni-plugins-dcctc" Apr 22 19:23:30.536831 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.535151 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/32dd8967-d559-441e-95e3-6faf8bc49253-cni-binary-copy\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.536831 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.535612 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-etc-systemd\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.536831 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.535686 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-host-run-k8s-cni-cncf-io\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.536831 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:30.535752 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:30.536831 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:30.535854 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c461896-346c-4de1-9362-b9f83bd3486d-metrics-certs podName:1c461896-346c-4de1-9362-b9f83bd3486d nodeName:}" failed. No retries permitted until 2026-04-22 19:23:31.035832828 +0000 UTC m=+3.155212135 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c461896-346c-4de1-9362-b9f83bd3486d-metrics-certs") pod "network-metrics-daemon-nndbq" (UID: "1c461896-346c-4de1-9362-b9f83bd3486d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:30.536831 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.536203 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b948da6e-8c3e-4892-92f1-4f59d7c5c885-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dcctc\" (UID: \"b948da6e-8c3e-4892-92f1-4f59d7c5c885\") " pod="openshift-multus/multus-additional-cni-plugins-dcctc" Apr 22 19:23:30.536831 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.533881 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-var-lib-kubelet\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.536831 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.536274 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b948da6e-8c3e-4892-92f1-4f59d7c5c885-cnibin\") pod \"multus-additional-cni-plugins-dcctc\" (UID: \"b948da6e-8c3e-4892-92f1-4f59d7c5c885\") " pod="openshift-multus/multus-additional-cni-plugins-dcctc" Apr 22 19:23:30.536831 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.536350 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-cnibin\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.536831 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.536454 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/159fb4c8-bbd5-4247-849c-f5639e9543f7-registration-dir\") pod \"aws-ebs-csi-driver-node-lm4wg\" (UID: \"159fb4c8-bbd5-4247-849c-f5639e9543f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg" Apr 22 19:23:30.536831 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.536461 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-etc-sysconfig\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.536831 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.536609 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-etc-modprobe-d\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.536831 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.536677 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0fdac791-5aa3-4153-bb07-34cec3dbf296-run\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.536831 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.536755 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32dd8967-d559-441e-95e3-6faf8bc49253-system-cni-dir\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.537642 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.536929 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fdac791-5aa3-4153-bb07-34cec3dbf296-tmp\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.539482 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.539453 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0fdac791-5aa3-4153-bb07-34cec3dbf296-etc-tuned\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.540258 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:30.540237 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:30.540258 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:30.540256 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:30.540408 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:30.540266 2574 projected.go:194] Error preparing data for projected volume kube-api-access-7fqcc for pod openshift-network-diagnostics/network-check-target-bllx4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:30.540408 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:30.540320 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cbfd8869-819e-45c7-9536-08c72a48f2c3-kube-api-access-7fqcc podName:cbfd8869-819e-45c7-9536-08c72a48f2c3 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:31.040307205 +0000 UTC m=+3.159686513 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7fqcc" (UniqueName: "kubernetes.io/projected/cbfd8869-819e-45c7-9536-08c72a48f2c3-kube-api-access-7fqcc") pod "network-check-target-bllx4" (UID: "cbfd8869-819e-45c7-9536-08c72a48f2c3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:30.540505 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.540482 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/76e397fd-d6b3-4cfb-aa90-fb57dfa68ba4-agent-certs\") pod \"konnectivity-agent-42mdb\" (UID: \"76e397fd-d6b3-4cfb-aa90-fb57dfa68ba4\") " pod="kube-system/konnectivity-agent-42mdb" Apr 22 19:23:30.542740 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.542717 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jnww\" (UniqueName: \"kubernetes.io/projected/1c461896-346c-4de1-9362-b9f83bd3486d-kube-api-access-2jnww\") pod \"network-metrics-daemon-nndbq\" (UID: \"1c461896-346c-4de1-9362-b9f83bd3486d\") " pod="openshift-multus/network-metrics-daemon-nndbq" Apr 22 19:23:30.544530 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.544504 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpdxw\" (UniqueName: \"kubernetes.io/projected/b948da6e-8c3e-4892-92f1-4f59d7c5c885-kube-api-access-bpdxw\") pod \"multus-additional-cni-plugins-dcctc\" (UID: \"b948da6e-8c3e-4892-92f1-4f59d7c5c885\") " pod="openshift-multus/multus-additional-cni-plugins-dcctc" Apr 22 19:23:30.545142 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.545122 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74sqt\" (UniqueName: \"kubernetes.io/projected/0fdac791-5aa3-4153-bb07-34cec3dbf296-kube-api-access-74sqt\") pod \"tuned-7kfks\" (UID: \"0fdac791-5aa3-4153-bb07-34cec3dbf296\") " 
pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.545598 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.545581 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2rxg\" (UniqueName: \"kubernetes.io/projected/159fb4c8-bbd5-4247-849c-f5639e9543f7-kube-api-access-n2rxg\") pod \"aws-ebs-csi-driver-node-lm4wg\" (UID: \"159fb4c8-bbd5-4247-849c-f5639e9543f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg" Apr 22 19:23:30.545937 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.545917 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgwv5\" (UniqueName: \"kubernetes.io/projected/32dd8967-d559-441e-95e3-6faf8bc49253-kube-api-access-fgwv5\") pod \"multus-8z9zt\" (UID: \"32dd8967-d559-441e-95e3-6faf8bc49253\") " pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.594278 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.594165 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:30.630673 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.630625 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cpzbb" Apr 22 19:23:30.637454 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.637427 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jkcf2" Apr 22 19:23:30.649031 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.649008 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-85tpl" Apr 22 19:23:30.654607 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.654582 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg" Apr 22 19:23:30.662276 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.662245 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7kfks" Apr 22 19:23:30.669984 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.669958 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:30.676616 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.676596 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dcctc" Apr 22 19:23:30.686234 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.686208 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8z9zt" Apr 22 19:23:30.692814 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:30.692788 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-42mdb" Apr 22 19:23:31.038171 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:31.038077 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c461896-346c-4de1-9362-b9f83bd3486d-metrics-certs\") pod \"network-metrics-daemon-nndbq\" (UID: \"1c461896-346c-4de1-9362-b9f83bd3486d\") " pod="openshift-multus/network-metrics-daemon-nndbq" Apr 22 19:23:31.038334 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:31.038231 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:31.038334 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:31.038297 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c461896-346c-4de1-9362-b9f83bd3486d-metrics-certs podName:1c461896-346c-4de1-9362-b9f83bd3486d nodeName:}" failed. No retries permitted until 2026-04-22 19:23:32.038280411 +0000 UTC m=+4.157659700 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c461896-346c-4de1-9362-b9f83bd3486d-metrics-certs") pod "network-metrics-daemon-nndbq" (UID: "1c461896-346c-4de1-9362-b9f83bd3486d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:31.108845 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:31.108818 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod159fb4c8_bbd5_4247_849c_f5639e9543f7.slice/crio-58c37eaee66747fc01e7e977d092de17f72a51ccb308016fab9d692f3ccd160d WatchSource:0}: Error finding container 58c37eaee66747fc01e7e977d092de17f72a51ccb308016fab9d692f3ccd160d: Status 404 returned error can't find the container with id 58c37eaee66747fc01e7e977d092de17f72a51ccb308016fab9d692f3ccd160d Apr 22 19:23:31.110158 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:31.110131 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5879d8e5_623a_4ec2_9a22_b0b6c0c5917b.slice/crio-39fbeb0448ae5a07141cfb645618a077085cfe796bf57d63204d73f2926108dd WatchSource:0}: Error finding container 39fbeb0448ae5a07141cfb645618a077085cfe796bf57d63204d73f2926108dd: Status 404 returned error can't find the container with id 39fbeb0448ae5a07141cfb645618a077085cfe796bf57d63204d73f2926108dd Apr 22 19:23:31.112146 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:31.112028 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76e397fd_d6b3_4cfb_aa90_fb57dfa68ba4.slice/crio-b8359e6af87e4a5ca3883533a3607abfb86224d3bd02a75ab2d6283e863aaa46 WatchSource:0}: Error finding container b8359e6af87e4a5ca3883533a3607abfb86224d3bd02a75ab2d6283e863aaa46: Status 404 returned error can't find the container with id b8359e6af87e4a5ca3883533a3607abfb86224d3bd02a75ab2d6283e863aaa46 Apr 22 19:23:31.116836 
ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:31.116324 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1c0b054_80a0_4cc5_b053_a4d99268aa8f.slice/crio-8db3923486c6da631fa20fc9c8db5ac241af587121525c4dff8f4f6dccecd5d4 WatchSource:0}: Error finding container 8db3923486c6da631fa20fc9c8db5ac241af587121525c4dff8f4f6dccecd5d4: Status 404 returned error can't find the container with id 8db3923486c6da631fa20fc9c8db5ac241af587121525c4dff8f4f6dccecd5d4 Apr 22 19:23:31.117823 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:31.117774 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32dd8967_d559_441e_95e3_6faf8bc49253.slice/crio-354ff4bebc013f45ab8edc82ea3a945c95490d508551d68cd607144226a38d8f WatchSource:0}: Error finding container 354ff4bebc013f45ab8edc82ea3a945c95490d508551d68cd607144226a38d8f: Status 404 returned error can't find the container with id 354ff4bebc013f45ab8edc82ea3a945c95490d508551d68cd607144226a38d8f Apr 22 19:23:31.118962 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:31.118931 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dbaab45_2adf_4e5c_b969_f8d3eb83ea37.slice/crio-63940c8963e521d77d11b2b19abb683ddaae7660194ee5450b4370a71b3f736f WatchSource:0}: Error finding container 63940c8963e521d77d11b2b19abb683ddaae7660194ee5450b4370a71b3f736f: Status 404 returned error can't find the container with id 63940c8963e521d77d11b2b19abb683ddaae7660194ee5450b4370a71b3f736f Apr 22 19:23:31.119496 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:31.119396 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91147599_bdf0_49f5_98ed_a3567eaf56db.slice/crio-7eda502bd63e398a2a0ded47061472541a079757ca1957b793276d947b2ba7a9 WatchSource:0}: 
Error finding container 7eda502bd63e398a2a0ded47061472541a079757ca1957b793276d947b2ba7a9: Status 404 returned error can't find the container with id 7eda502bd63e398a2a0ded47061472541a079757ca1957b793276d947b2ba7a9 Apr 22 19:23:31.120411 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:31.120284 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb948da6e_8c3e_4892_92f1_4f59d7c5c885.slice/crio-6f6e6081076005588ca94018f1b4702d49568af4d9f271bf3768dde56bac716e WatchSource:0}: Error finding container 6f6e6081076005588ca94018f1b4702d49568af4d9f271bf3768dde56bac716e: Status 404 returned error can't find the container with id 6f6e6081076005588ca94018f1b4702d49568af4d9f271bf3768dde56bac716e Apr 22 19:23:31.121156 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:23:31.121084 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fdac791_5aa3_4153_bb07_34cec3dbf296.slice/crio-1623a766be6adb221bb9b05275bf8868dc66430dccac5efbe2360aabc91dea40 WatchSource:0}: Error finding container 1623a766be6adb221bb9b05275bf8868dc66430dccac5efbe2360aabc91dea40: Status 404 returned error can't find the container with id 1623a766be6adb221bb9b05275bf8868dc66430dccac5efbe2360aabc91dea40 Apr 22 19:23:31.138743 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:31.138680 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fqcc\" (UniqueName: \"kubernetes.io/projected/cbfd8869-819e-45c7-9536-08c72a48f2c3-kube-api-access-7fqcc\") pod \"network-check-target-bllx4\" (UID: \"cbfd8869-819e-45c7-9536-08c72a48f2c3\") " pod="openshift-network-diagnostics/network-check-target-bllx4" Apr 22 19:23:31.138890 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:31.138873 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:31.138954 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:31.138897 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:31.138954 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:31.138908 2574 projected.go:194] Error preparing data for projected volume kube-api-access-7fqcc for pod openshift-network-diagnostics/network-check-target-bllx4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:31.139054 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:31.138957 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cbfd8869-819e-45c7-9536-08c72a48f2c3-kube-api-access-7fqcc podName:cbfd8869-819e-45c7-9536-08c72a48f2c3 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:32.138939775 +0000 UTC m=+4.258319061 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7fqcc" (UniqueName: "kubernetes.io/projected/cbfd8869-819e-45c7-9536-08c72a48f2c3-kube-api-access-7fqcc") pod "network-check-target-bllx4" (UID: "cbfd8869-819e-45c7-9536-08c72a48f2c3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:31.364945 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:31.364863 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:18:29 +0000 UTC" deadline="2027-12-18 01:03:59.676470197 +0000 UTC" Apr 22 19:23:31.364945 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:31.364895 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14501h40m28.311578256s" Apr 22 19:23:31.432442 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:31.432389 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcctc" event={"ID":"b948da6e-8c3e-4892-92f1-4f59d7c5c885","Type":"ContainerStarted","Data":"6f6e6081076005588ca94018f1b4702d49568af4d9f271bf3768dde56bac716e"} Apr 22 19:23:31.435164 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:31.435119 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-85tpl" event={"ID":"7dbaab45-2adf-4e5c-b969-f8d3eb83ea37","Type":"ContainerStarted","Data":"63940c8963e521d77d11b2b19abb683ddaae7660194ee5450b4370a71b3f736f"} Apr 22 19:23:31.436533 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:31.436508 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jkcf2" event={"ID":"91147599-bdf0-49f5-98ed-a3567eaf56db","Type":"ContainerStarted","Data":"7eda502bd63e398a2a0ded47061472541a079757ca1957b793276d947b2ba7a9"} Apr 22 19:23:31.437428 ip-10-0-129-145 
kubenswrapper[2574]: I0422 19:23:31.437405 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" event={"ID":"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b","Type":"ContainerStarted","Data":"39fbeb0448ae5a07141cfb645618a077085cfe796bf57d63204d73f2926108dd"} Apr 22 19:23:31.438437 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:31.438408 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg" event={"ID":"159fb4c8-bbd5-4247-849c-f5639e9543f7","Type":"ContainerStarted","Data":"58c37eaee66747fc01e7e977d092de17f72a51ccb308016fab9d692f3ccd160d"} Apr 22 19:23:31.440002 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:31.439972 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-145.ec2.internal" event={"ID":"b71282cca731aa5ddeb9357344f86ebf","Type":"ContainerStarted","Data":"2018080465fe8516a1bf62b01b8143f6a307406cdc686ee7251801f27a90553f"} Apr 22 19:23:31.444432 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:31.444399 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7kfks" event={"ID":"0fdac791-5aa3-4153-bb07-34cec3dbf296","Type":"ContainerStarted","Data":"1623a766be6adb221bb9b05275bf8868dc66430dccac5efbe2360aabc91dea40"} Apr 22 19:23:31.445480 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:31.445455 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cpzbb" event={"ID":"b1c0b054-80a0-4cc5-b053-a4d99268aa8f","Type":"ContainerStarted","Data":"8db3923486c6da631fa20fc9c8db5ac241af587121525c4dff8f4f6dccecd5d4"} Apr 22 19:23:31.446560 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:31.446539 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:31.446772 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:31.446736 2574 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kube-system/konnectivity-agent-42mdb" event={"ID":"76e397fd-d6b3-4cfb-aa90-fb57dfa68ba4","Type":"ContainerStarted","Data":"b8359e6af87e4a5ca3883533a3607abfb86224d3bd02a75ab2d6283e863aaa46"} Apr 22 19:23:31.448571 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:31.448518 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8z9zt" event={"ID":"32dd8967-d559-441e-95e3-6faf8bc49253","Type":"ContainerStarted","Data":"354ff4bebc013f45ab8edc82ea3a945c95490d508551d68cd607144226a38d8f"} Apr 22 19:23:31.452036 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:31.451986 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-145.ec2.internal" podStartSLOduration=2.4519717180000002 podStartE2EDuration="2.451971718s" podCreationTimestamp="2026-04-22 19:23:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:23:31.451950632 +0000 UTC m=+3.571329942" watchObservedRunningTime="2026-04-22 19:23:31.451971718 +0000 UTC m=+3.571351021" Apr 22 19:23:32.045577 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:32.045544 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c461896-346c-4de1-9362-b9f83bd3486d-metrics-certs\") pod \"network-metrics-daemon-nndbq\" (UID: \"1c461896-346c-4de1-9362-b9f83bd3486d\") " pod="openshift-multus/network-metrics-daemon-nndbq" Apr 22 19:23:32.045710 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:32.045701 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:32.045773 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:32.045764 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/1c461896-346c-4de1-9362-b9f83bd3486d-metrics-certs podName:1c461896-346c-4de1-9362-b9f83bd3486d nodeName:}" failed. No retries permitted until 2026-04-22 19:23:34.04574429 +0000 UTC m=+6.165123577 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c461896-346c-4de1-9362-b9f83bd3486d-metrics-certs") pod "network-metrics-daemon-nndbq" (UID: "1c461896-346c-4de1-9362-b9f83bd3486d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:32.147142 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:32.146480 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fqcc\" (UniqueName: \"kubernetes.io/projected/cbfd8869-819e-45c7-9536-08c72a48f2c3-kube-api-access-7fqcc\") pod \"network-check-target-bllx4\" (UID: \"cbfd8869-819e-45c7-9536-08c72a48f2c3\") " pod="openshift-network-diagnostics/network-check-target-bllx4" Apr 22 19:23:32.147142 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:32.146666 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:32.147142 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:32.146684 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:32.147142 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:32.146697 2574 projected.go:194] Error preparing data for projected volume kube-api-access-7fqcc for pod openshift-network-diagnostics/network-check-target-bllx4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:32.147142 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:32.146762 2574 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cbfd8869-819e-45c7-9536-08c72a48f2c3-kube-api-access-7fqcc podName:cbfd8869-819e-45c7-9536-08c72a48f2c3 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:34.146734937 +0000 UTC m=+6.266114237 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7fqcc" (UniqueName: "kubernetes.io/projected/cbfd8869-819e-45c7-9536-08c72a48f2c3-kube-api-access-7fqcc") pod "network-check-target-bllx4" (UID: "cbfd8869-819e-45c7-9536-08c72a48f2c3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:32.427331 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:32.427228 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nndbq" Apr 22 19:23:32.427814 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:32.427372 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nndbq" podUID="1c461896-346c-4de1-9362-b9f83bd3486d" Apr 22 19:23:32.427814 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:32.427761 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bllx4" Apr 22 19:23:32.427933 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:32.427853 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bllx4" podUID="cbfd8869-819e-45c7-9536-08c72a48f2c3" Apr 22 19:23:32.458778 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:32.458742 2574 generic.go:358] "Generic (PLEG): container finished" podID="c78ed32340ce7d1e488ffab8bbfb0412" containerID="bda927dbd6a90b3351bcfbc08ff1b5a7270be4d70a2407fb5badc85ace477479" exitCode=0 Apr 22 19:23:32.459149 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:32.458920 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-145.ec2.internal" event={"ID":"c78ed32340ce7d1e488ffab8bbfb0412","Type":"ContainerDied","Data":"bda927dbd6a90b3351bcfbc08ff1b5a7270be4d70a2407fb5badc85ace477479"} Apr 22 19:23:33.465341 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:33.465265 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-145.ec2.internal" event={"ID":"c78ed32340ce7d1e488ffab8bbfb0412","Type":"ContainerStarted","Data":"fad6bf25aa0fff51707b122854b70bdfdc6c99c1ec2947e2d82f70dc9009a419"} Apr 22 19:23:33.481469 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:33.481416 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-145.ec2.internal" podStartSLOduration=4.48139738 podStartE2EDuration="4.48139738s" podCreationTimestamp="2026-04-22 19:23:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:23:33.479750965 +0000 UTC m=+5.599130277" watchObservedRunningTime="2026-04-22 19:23:33.48139738 +0000 UTC m=+5.600776690" Apr 22 19:23:34.064416 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:34.064375 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/1c461896-346c-4de1-9362-b9f83bd3486d-metrics-certs\") pod \"network-metrics-daemon-nndbq\" (UID: \"1c461896-346c-4de1-9362-b9f83bd3486d\") " pod="openshift-multus/network-metrics-daemon-nndbq" Apr 22 19:23:34.064616 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:34.064540 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:34.064680 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:34.064622 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c461896-346c-4de1-9362-b9f83bd3486d-metrics-certs podName:1c461896-346c-4de1-9362-b9f83bd3486d nodeName:}" failed. No retries permitted until 2026-04-22 19:23:38.064601639 +0000 UTC m=+10.183980932 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c461896-346c-4de1-9362-b9f83bd3486d-metrics-certs") pod "network-metrics-daemon-nndbq" (UID: "1c461896-346c-4de1-9362-b9f83bd3486d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:34.165003 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:34.164969 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fqcc\" (UniqueName: \"kubernetes.io/projected/cbfd8869-819e-45c7-9536-08c72a48f2c3-kube-api-access-7fqcc\") pod \"network-check-target-bllx4\" (UID: \"cbfd8869-819e-45c7-9536-08c72a48f2c3\") " pod="openshift-network-diagnostics/network-check-target-bllx4" Apr 22 19:23:34.165209 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:34.165190 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:34.165278 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:34.165215 2574 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:34.165278 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:34.165231 2574 projected.go:194] Error preparing data for projected volume kube-api-access-7fqcc for pod openshift-network-diagnostics/network-check-target-bllx4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:34.165374 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:34.165294 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cbfd8869-819e-45c7-9536-08c72a48f2c3-kube-api-access-7fqcc podName:cbfd8869-819e-45c7-9536-08c72a48f2c3 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:38.165279349 +0000 UTC m=+10.284658637 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7fqcc" (UniqueName: "kubernetes.io/projected/cbfd8869-819e-45c7-9536-08c72a48f2c3-kube-api-access-7fqcc") pod "network-check-target-bllx4" (UID: "cbfd8869-819e-45c7-9536-08c72a48f2c3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:34.426314 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:34.426233 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nndbq" Apr 22 19:23:34.426485 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:34.426381 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nndbq" podUID="1c461896-346c-4de1-9362-b9f83bd3486d" Apr 22 19:23:34.426485 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:34.426474 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bllx4" Apr 22 19:23:34.426596 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:34.426574 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bllx4" podUID="cbfd8869-819e-45c7-9536-08c72a48f2c3" Apr 22 19:23:36.425019 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:36.424977 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nndbq" Apr 22 19:23:36.425521 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:36.425137 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nndbq" podUID="1c461896-346c-4de1-9362-b9f83bd3486d" Apr 22 19:23:36.425521 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:36.425511 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bllx4" Apr 22 19:23:36.425618 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:36.425597 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bllx4" podUID="cbfd8869-819e-45c7-9536-08c72a48f2c3" Apr 22 19:23:38.098092 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:38.097729 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c461896-346c-4de1-9362-b9f83bd3486d-metrics-certs\") pod \"network-metrics-daemon-nndbq\" (UID: \"1c461896-346c-4de1-9362-b9f83bd3486d\") " pod="openshift-multus/network-metrics-daemon-nndbq" Apr 22 19:23:38.098092 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:38.097888 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:38.098092 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:38.097948 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c461896-346c-4de1-9362-b9f83bd3486d-metrics-certs podName:1c461896-346c-4de1-9362-b9f83bd3486d nodeName:}" failed. No retries permitted until 2026-04-22 19:23:46.097929459 +0000 UTC m=+18.217308751 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c461896-346c-4de1-9362-b9f83bd3486d-metrics-certs") pod "network-metrics-daemon-nndbq" (UID: "1c461896-346c-4de1-9362-b9f83bd3486d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:38.199132 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:38.198935 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fqcc\" (UniqueName: \"kubernetes.io/projected/cbfd8869-819e-45c7-9536-08c72a48f2c3-kube-api-access-7fqcc\") pod \"network-check-target-bllx4\" (UID: \"cbfd8869-819e-45c7-9536-08c72a48f2c3\") " pod="openshift-network-diagnostics/network-check-target-bllx4" Apr 22 19:23:38.199132 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:38.199125 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:38.199132 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:38.199143 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:38.199413 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:38.199155 2574 projected.go:194] Error preparing data for projected volume kube-api-access-7fqcc for pod openshift-network-diagnostics/network-check-target-bllx4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:38.199413 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:38.199211 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cbfd8869-819e-45c7-9536-08c72a48f2c3-kube-api-access-7fqcc podName:cbfd8869-819e-45c7-9536-08c72a48f2c3 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:23:46.19919546 +0000 UTC m=+18.318574761 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-7fqcc" (UniqueName: "kubernetes.io/projected/cbfd8869-819e-45c7-9536-08c72a48f2c3-kube-api-access-7fqcc") pod "network-check-target-bllx4" (UID: "cbfd8869-819e-45c7-9536-08c72a48f2c3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:38.426686 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:38.425952 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nndbq" Apr 22 19:23:38.426686 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:38.426069 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nndbq" podUID="1c461896-346c-4de1-9362-b9f83bd3486d" Apr 22 19:23:38.426686 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:38.426502 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bllx4" Apr 22 19:23:38.426686 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:38.426594 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bllx4" podUID="cbfd8869-819e-45c7-9536-08c72a48f2c3" Apr 22 19:23:40.425259 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:40.425180 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nndbq" Apr 22 19:23:40.425259 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:40.425235 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bllx4" Apr 22 19:23:40.425736 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:40.425320 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nndbq" podUID="1c461896-346c-4de1-9362-b9f83bd3486d" Apr 22 19:23:40.425736 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:40.425455 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bllx4" podUID="cbfd8869-819e-45c7-9536-08c72a48f2c3" Apr 22 19:23:42.425081 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:42.425051 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nndbq" Apr 22 19:23:42.425496 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:42.425095 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bllx4" Apr 22 19:23:42.425496 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:42.425188 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nndbq" podUID="1c461896-346c-4de1-9362-b9f83bd3486d" Apr 22 19:23:42.425496 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:42.425286 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bllx4" podUID="cbfd8869-819e-45c7-9536-08c72a48f2c3" Apr 22 19:23:42.549064 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:42.549027 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-k48jx"] Apr 22 19:23:42.551674 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:42.551648 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k48jx" Apr 22 19:23:42.551777 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:42.551712 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-k48jx" podUID="793d5ba1-c977-4404-bd38-8b78e8e5e191" Apr 22 19:23:42.633883 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:42.633845 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/793d5ba1-c977-4404-bd38-8b78e8e5e191-original-pull-secret\") pod \"global-pull-secret-syncer-k48jx\" (UID: \"793d5ba1-c977-4404-bd38-8b78e8e5e191\") " pod="kube-system/global-pull-secret-syncer-k48jx" Apr 22 19:23:42.634036 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:42.633965 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/793d5ba1-c977-4404-bd38-8b78e8e5e191-kubelet-config\") pod \"global-pull-secret-syncer-k48jx\" (UID: \"793d5ba1-c977-4404-bd38-8b78e8e5e191\") " pod="kube-system/global-pull-secret-syncer-k48jx" Apr 22 19:23:42.634036 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:42.634008 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/793d5ba1-c977-4404-bd38-8b78e8e5e191-dbus\") pod \"global-pull-secret-syncer-k48jx\" (UID: \"793d5ba1-c977-4404-bd38-8b78e8e5e191\") " pod="kube-system/global-pull-secret-syncer-k48jx" Apr 22 19:23:42.735191 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:42.735110 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/793d5ba1-c977-4404-bd38-8b78e8e5e191-kubelet-config\") pod \"global-pull-secret-syncer-k48jx\" (UID: \"793d5ba1-c977-4404-bd38-8b78e8e5e191\") " pod="kube-system/global-pull-secret-syncer-k48jx" Apr 22 19:23:42.735191 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:42.735151 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/793d5ba1-c977-4404-bd38-8b78e8e5e191-dbus\") pod \"global-pull-secret-syncer-k48jx\" (UID: \"793d5ba1-c977-4404-bd38-8b78e8e5e191\") " pod="kube-system/global-pull-secret-syncer-k48jx" Apr 22 19:23:42.735412 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:42.735235 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/793d5ba1-c977-4404-bd38-8b78e8e5e191-kubelet-config\") pod \"global-pull-secret-syncer-k48jx\" (UID: \"793d5ba1-c977-4404-bd38-8b78e8e5e191\") " pod="kube-system/global-pull-secret-syncer-k48jx" Apr 22 19:23:42.735412 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:42.735318 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/793d5ba1-c977-4404-bd38-8b78e8e5e191-original-pull-secret\") pod \"global-pull-secret-syncer-k48jx\" (UID: \"793d5ba1-c977-4404-bd38-8b78e8e5e191\") " pod="kube-system/global-pull-secret-syncer-k48jx" Apr 22 19:23:42.735412 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:42.735382 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/793d5ba1-c977-4404-bd38-8b78e8e5e191-dbus\") pod \"global-pull-secret-syncer-k48jx\" (UID: \"793d5ba1-c977-4404-bd38-8b78e8e5e191\") " pod="kube-system/global-pull-secret-syncer-k48jx" Apr 22 19:23:42.735555 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:42.735432 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:42.735555 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:42.735495 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/793d5ba1-c977-4404-bd38-8b78e8e5e191-original-pull-secret podName:793d5ba1-c977-4404-bd38-8b78e8e5e191 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:23:43.235476366 +0000 UTC m=+15.354855666 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/793d5ba1-c977-4404-bd38-8b78e8e5e191-original-pull-secret") pod "global-pull-secret-syncer-k48jx" (UID: "793d5ba1-c977-4404-bd38-8b78e8e5e191") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:43.238180 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:43.238146 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/793d5ba1-c977-4404-bd38-8b78e8e5e191-original-pull-secret\") pod \"global-pull-secret-syncer-k48jx\" (UID: \"793d5ba1-c977-4404-bd38-8b78e8e5e191\") " pod="kube-system/global-pull-secret-syncer-k48jx" Apr 22 19:23:43.238340 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:43.238309 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:43.238384 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:43.238379 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/793d5ba1-c977-4404-bd38-8b78e8e5e191-original-pull-secret podName:793d5ba1-c977-4404-bd38-8b78e8e5e191 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:44.238359278 +0000 UTC m=+16.357738575 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/793d5ba1-c977-4404-bd38-8b78e8e5e191-original-pull-secret") pod "global-pull-secret-syncer-k48jx" (UID: "793d5ba1-c977-4404-bd38-8b78e8e5e191") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:44.246625 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:44.246586 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/793d5ba1-c977-4404-bd38-8b78e8e5e191-original-pull-secret\") pod \"global-pull-secret-syncer-k48jx\" (UID: \"793d5ba1-c977-4404-bd38-8b78e8e5e191\") " pod="kube-system/global-pull-secret-syncer-k48jx" Apr 22 19:23:44.247074 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:44.246748 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:44.247074 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:44.246838 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/793d5ba1-c977-4404-bd38-8b78e8e5e191-original-pull-secret podName:793d5ba1-c977-4404-bd38-8b78e8e5e191 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:46.246817127 +0000 UTC m=+18.366196430 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/793d5ba1-c977-4404-bd38-8b78e8e5e191-original-pull-secret") pod "global-pull-secret-syncer-k48jx" (UID: "793d5ba1-c977-4404-bd38-8b78e8e5e191") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:44.425233 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:44.425199 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bllx4" Apr 22 19:23:44.425391 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:44.425198 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k48jx" Apr 22 19:23:44.425391 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:44.425335 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bllx4" podUID="cbfd8869-819e-45c7-9536-08c72a48f2c3" Apr 22 19:23:44.425516 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:44.425417 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k48jx" podUID="793d5ba1-c977-4404-bd38-8b78e8e5e191" Apr 22 19:23:44.425516 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:44.425201 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nndbq" Apr 22 19:23:44.425615 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:44.425578 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nndbq" podUID="1c461896-346c-4de1-9362-b9f83bd3486d" Apr 22 19:23:46.161745 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:46.161709 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c461896-346c-4de1-9362-b9f83bd3486d-metrics-certs\") pod \"network-metrics-daemon-nndbq\" (UID: \"1c461896-346c-4de1-9362-b9f83bd3486d\") " pod="openshift-multus/network-metrics-daemon-nndbq" Apr 22 19:23:46.162228 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:46.161905 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:46.162228 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:46.161989 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c461896-346c-4de1-9362-b9f83bd3486d-metrics-certs podName:1c461896-346c-4de1-9362-b9f83bd3486d nodeName:}" failed. No retries permitted until 2026-04-22 19:24:02.161967461 +0000 UTC m=+34.281346753 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c461896-346c-4de1-9362-b9f83bd3486d-metrics-certs") pod "network-metrics-daemon-nndbq" (UID: "1c461896-346c-4de1-9362-b9f83bd3486d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:46.263006 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:46.262959 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/793d5ba1-c977-4404-bd38-8b78e8e5e191-original-pull-secret\") pod \"global-pull-secret-syncer-k48jx\" (UID: \"793d5ba1-c977-4404-bd38-8b78e8e5e191\") " pod="kube-system/global-pull-secret-syncer-k48jx" Apr 22 19:23:46.263164 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:46.263024 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fqcc\" (UniqueName: \"kubernetes.io/projected/cbfd8869-819e-45c7-9536-08c72a48f2c3-kube-api-access-7fqcc\") pod \"network-check-target-bllx4\" (UID: \"cbfd8869-819e-45c7-9536-08c72a48f2c3\") " pod="openshift-network-diagnostics/network-check-target-bllx4" Apr 22 19:23:46.263164 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:46.263149 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:46.263283 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:46.263165 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:46.263283 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:46.263186 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:46.263283 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:46.263198 2574 projected.go:194] Error 
preparing data for projected volume kube-api-access-7fqcc for pod openshift-network-diagnostics/network-check-target-bllx4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:46.263283 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:46.263229 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/793d5ba1-c977-4404-bd38-8b78e8e5e191-original-pull-secret podName:793d5ba1-c977-4404-bd38-8b78e8e5e191 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:50.263206652 +0000 UTC m=+22.382585955 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/793d5ba1-c977-4404-bd38-8b78e8e5e191-original-pull-secret") pod "global-pull-secret-syncer-k48jx" (UID: "793d5ba1-c977-4404-bd38-8b78e8e5e191") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:46.263283 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:46.263248 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cbfd8869-819e-45c7-9536-08c72a48f2c3-kube-api-access-7fqcc podName:cbfd8869-819e-45c7-9536-08c72a48f2c3 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:02.263238763 +0000 UTC m=+34.382618053 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-7fqcc" (UniqueName: "kubernetes.io/projected/cbfd8869-819e-45c7-9536-08c72a48f2c3-kube-api-access-7fqcc") pod "network-check-target-bllx4" (UID: "cbfd8869-819e-45c7-9536-08c72a48f2c3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:46.424871 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:46.424780 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-k48jx" Apr 22 19:23:46.424871 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:46.424848 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nndbq" Apr 22 19:23:46.425096 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:46.424973 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bllx4" Apr 22 19:23:46.425096 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:46.425002 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k48jx" podUID="793d5ba1-c977-4404-bd38-8b78e8e5e191" Apr 22 19:23:46.425096 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:46.425067 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bllx4" podUID="cbfd8869-819e-45c7-9536-08c72a48f2c3" Apr 22 19:23:46.425240 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:46.425166 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nndbq" podUID="1c461896-346c-4de1-9362-b9f83bd3486d" Apr 22 19:23:48.425693 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:48.425493 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k48jx" Apr 22 19:23:48.426417 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:48.425548 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nndbq" Apr 22 19:23:48.426417 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:48.425753 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k48jx" podUID="793d5ba1-c977-4404-bd38-8b78e8e5e191" Apr 22 19:23:48.426417 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:48.425567 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bllx4" Apr 22 19:23:48.426417 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:48.425876 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nndbq" podUID="1c461896-346c-4de1-9362-b9f83bd3486d" Apr 22 19:23:48.426417 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:48.425960 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bllx4" podUID="cbfd8869-819e-45c7-9536-08c72a48f2c3" Apr 22 19:23:48.492903 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:48.492868 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8z9zt" event={"ID":"32dd8967-d559-441e-95e3-6faf8bc49253","Type":"ContainerStarted","Data":"37ac52e9ac09f471ca3cbd26b9b87dde9a7f94a53484d6b49ab6baad3e52aa7b"} Apr 22 19:23:48.494194 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:48.494162 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcctc" event={"ID":"b948da6e-8c3e-4892-92f1-4f59d7c5c885","Type":"ContainerStarted","Data":"12456a9009c48b2d1ba10e8a81f0625f7c80a5e64f0a4a00a7398e907f583528"} Apr 22 19:23:48.495410 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:48.495388 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jkcf2" event={"ID":"91147599-bdf0-49f5-98ed-a3567eaf56db","Type":"ContainerStarted","Data":"1111520be0045237b500d5d7f8990b4c5dcb75bd9be4ebc3cb75390ffd5d4133"} Apr 22 19:23:48.497243 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:48.497217 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" event={"ID":"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b","Type":"ContainerStarted","Data":"bfdd5531a7c2408d8cbbb36161832eb287184e6deaf20a3bd7e65105eb643cc7"} Apr 22 19:23:48.497354 ip-10-0-129-145 kubenswrapper[2574]: I0422 
19:23:48.497250 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" event={"ID":"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b","Type":"ContainerStarted","Data":"1e0bd7afa2c184438004fadc04c2dd71897b9d63a3a9ed4f856956fa3054c06b"} Apr 22 19:23:48.498867 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:48.498662 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg" event={"ID":"159fb4c8-bbd5-4247-849c-f5639e9543f7","Type":"ContainerStarted","Data":"0f2bd95af542824347ae552c42df21472c9018888c29d7e0178d915e29cdf86a"} Apr 22 19:23:48.499971 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:48.499949 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7kfks" event={"ID":"0fdac791-5aa3-4153-bb07-34cec3dbf296","Type":"ContainerStarted","Data":"64cb4247ac8bf67e118f22ae7f408e9da15854594c93bd8e35cdb367153aa78d"} Apr 22 19:23:48.501272 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:48.501252 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cpzbb" event={"ID":"b1c0b054-80a0-4cc5-b053-a4d99268aa8f","Type":"ContainerStarted","Data":"a373aff481d10e2be393eceb981ecec8367c8c3e4422ec2354030d6d7587dee8"} Apr 22 19:23:48.502459 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:48.502431 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-42mdb" event={"ID":"76e397fd-d6b3-4cfb-aa90-fb57dfa68ba4","Type":"ContainerStarted","Data":"7e02ebab82420df172c108e48e00579f2e35e80de65a4df96456fcf04775dc96"} Apr 22 19:23:48.517164 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:48.517116 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8z9zt" podStartSLOduration=3.604620771 podStartE2EDuration="20.517099975s" podCreationTimestamp="2026-04-22 19:23:28 +0000 UTC" firstStartedPulling="2026-04-22 19:23:31.11986954 
+0000 UTC m=+3.239248828" lastFinishedPulling="2026-04-22 19:23:48.032348744 +0000 UTC m=+20.151728032" observedRunningTime="2026-04-22 19:23:48.50979222 +0000 UTC m=+20.629171528" watchObservedRunningTime="2026-04-22 19:23:48.517099975 +0000 UTC m=+20.636479285" Apr 22 19:23:48.526257 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:48.526220 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-42mdb" podStartSLOduration=3.629001966 podStartE2EDuration="20.526209625s" podCreationTimestamp="2026-04-22 19:23:28 +0000 UTC" firstStartedPulling="2026-04-22 19:23:31.115769824 +0000 UTC m=+3.235149126" lastFinishedPulling="2026-04-22 19:23:48.012977494 +0000 UTC m=+20.132356785" observedRunningTime="2026-04-22 19:23:48.52571039 +0000 UTC m=+20.645089698" watchObservedRunningTime="2026-04-22 19:23:48.526209625 +0000 UTC m=+20.645588933" Apr 22 19:23:48.539837 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:48.539765 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-cpzbb" podStartSLOduration=3.682022457 podStartE2EDuration="20.539746512s" podCreationTimestamp="2026-04-22 19:23:28 +0000 UTC" firstStartedPulling="2026-04-22 19:23:31.118311854 +0000 UTC m=+3.237691143" lastFinishedPulling="2026-04-22 19:23:47.976035911 +0000 UTC m=+20.095415198" observedRunningTime="2026-04-22 19:23:48.539614514 +0000 UTC m=+20.658993824" watchObservedRunningTime="2026-04-22 19:23:48.539746512 +0000 UTC m=+20.659125822" Apr 22 19:23:48.555266 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:48.555204 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jkcf2" podStartSLOduration=11.732777992 podStartE2EDuration="20.555185129s" podCreationTimestamp="2026-04-22 19:23:28 +0000 UTC" firstStartedPulling="2026-04-22 19:23:31.121869105 +0000 UTC m=+3.241248407" lastFinishedPulling="2026-04-22 19:23:39.944276253 +0000 UTC 
m=+12.063655544" observedRunningTime="2026-04-22 19:23:48.554414865 +0000 UTC m=+20.673794174" watchObservedRunningTime="2026-04-22 19:23:48.555185129 +0000 UTC m=+20.674564457" Apr 22 19:23:48.572336 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:48.572174 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-7kfks" podStartSLOduration=3.67987 podStartE2EDuration="20.572157912s" podCreationTimestamp="2026-04-22 19:23:28 +0000 UTC" firstStartedPulling="2026-04-22 19:23:31.123554083 +0000 UTC m=+3.242933384" lastFinishedPulling="2026-04-22 19:23:48.015841996 +0000 UTC m=+20.135221296" observedRunningTime="2026-04-22 19:23:48.571959867 +0000 UTC m=+20.691339178" watchObservedRunningTime="2026-04-22 19:23:48.572157912 +0000 UTC m=+20.691537223" Apr 22 19:23:49.213769 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:49.213685 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-42mdb" Apr 22 19:23:49.214351 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:49.214332 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-42mdb" Apr 22 19:23:49.506496 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:49.506469 2574 generic.go:358] "Generic (PLEG): container finished" podID="b948da6e-8c3e-4892-92f1-4f59d7c5c885" containerID="12456a9009c48b2d1ba10e8a81f0625f7c80a5e64f0a4a00a7398e907f583528" exitCode=0 Apr 22 19:23:49.506915 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:49.506563 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcctc" event={"ID":"b948da6e-8c3e-4892-92f1-4f59d7c5c885","Type":"ContainerDied","Data":"12456a9009c48b2d1ba10e8a81f0625f7c80a5e64f0a4a00a7398e907f583528"} Apr 22 19:23:49.509142 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:49.509124 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-acl-logging/0.log" Apr 22 19:23:49.509414 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:49.509393 2574 generic.go:358] "Generic (PLEG): container finished" podID="5879d8e5-623a-4ec2-9a22-b0b6c0c5917b" containerID="bfdd5531a7c2408d8cbbb36161832eb287184e6deaf20a3bd7e65105eb643cc7" exitCode=1 Apr 22 19:23:49.509522 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:49.509480 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" event={"ID":"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b","Type":"ContainerDied","Data":"bfdd5531a7c2408d8cbbb36161832eb287184e6deaf20a3bd7e65105eb643cc7"} Apr 22 19:23:49.509594 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:49.509523 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" event={"ID":"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b","Type":"ContainerStarted","Data":"fb62301f640dce7b732620230adcebc38f42e48b7dedac65baa19f515dae59d1"} Apr 22 19:23:49.509594 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:49.509537 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" event={"ID":"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b","Type":"ContainerStarted","Data":"6f44655d902765345f2c7ba159cb740d58ca68809e7dad279384a4e0b0628405"} Apr 22 19:23:49.509594 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:49.509549 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" event={"ID":"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b","Type":"ContainerStarted","Data":"d54f42b3d3c8efd80e07c9c465dca630a9a3234edb95b213e64e8199c73d18fa"} Apr 22 19:23:49.509594 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:49.509561 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" 
event={"ID":"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b","Type":"ContainerStarted","Data":"4231fd748f44585e20471167028f06919a8e2eab68ccc8b17ab2a2f037fac54d"} Apr 22 19:23:49.509783 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:49.509669 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-42mdb" Apr 22 19:23:49.510214 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:49.510194 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-42mdb" Apr 22 19:23:49.686188 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:49.686038 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 19:23:50.293597 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:50.293564 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/793d5ba1-c977-4404-bd38-8b78e8e5e191-original-pull-secret\") pod \"global-pull-secret-syncer-k48jx\" (UID: \"793d5ba1-c977-4404-bd38-8b78e8e5e191\") " pod="kube-system/global-pull-secret-syncer-k48jx" Apr 22 19:23:50.293830 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:50.293700 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:50.293830 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:50.293772 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/793d5ba1-c977-4404-bd38-8b78e8e5e191-original-pull-secret podName:793d5ba1-c977-4404-bd38-8b78e8e5e191 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:58.293750522 +0000 UTC m=+30.413129809 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/793d5ba1-c977-4404-bd38-8b78e8e5e191-original-pull-secret") pod "global-pull-secret-syncer-k48jx" (UID: "793d5ba1-c977-4404-bd38-8b78e8e5e191") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:50.388073 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:50.387955 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:23:49.686179792Z","UUID":"32faa15f-bb85-4a5e-83e7-2e9deae138e8","Handler":null,"Name":"","Endpoint":""} Apr 22 19:23:50.390034 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:50.390001 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 19:23:50.390034 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:50.390032 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 19:23:50.424674 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:50.424643 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k48jx" Apr 22 19:23:50.424855 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:50.424654 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bllx4" Apr 22 19:23:50.424855 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:50.424769 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-k48jx" podUID="793d5ba1-c977-4404-bd38-8b78e8e5e191" Apr 22 19:23:50.424979 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:50.424873 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bllx4" podUID="cbfd8869-819e-45c7-9536-08c72a48f2c3" Apr 22 19:23:50.424979 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:50.424654 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nndbq" Apr 22 19:23:50.425078 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:50.425038 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nndbq" podUID="1c461896-346c-4de1-9362-b9f83bd3486d" Apr 22 19:23:50.513084 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:50.513046 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg" event={"ID":"159fb4c8-bbd5-4247-849c-f5639e9543f7","Type":"ContainerStarted","Data":"426802bb5d19c23c62fc3260fd94ed71440afd39fcc123e113ed24758d80447c"} Apr 22 19:23:50.514530 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:50.514491 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-85tpl" event={"ID":"7dbaab45-2adf-4e5c-b969-f8d3eb83ea37","Type":"ContainerStarted","Data":"b1cfc63290d8da17b8197df8790b5291f2a339edd448e147139624955845172e"} Apr 22 19:23:50.532272 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:50.532219 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-85tpl" podStartSLOduration=5.639753681 podStartE2EDuration="22.53220449s" podCreationTimestamp="2026-04-22 19:23:28 +0000 UTC" firstStartedPulling="2026-04-22 19:23:31.120635772 +0000 UTC m=+3.240015059" lastFinishedPulling="2026-04-22 19:23:48.013086574 +0000 UTC m=+20.132465868" observedRunningTime="2026-04-22 19:23:50.532005648 +0000 UTC m=+22.651384958" watchObservedRunningTime="2026-04-22 19:23:50.53220449 +0000 UTC m=+22.651583801" Apr 22 19:23:51.518979 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:51.518949 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-acl-logging/0.log" Apr 22 19:23:51.519643 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:51.519320 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" 
event={"ID":"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b","Type":"ContainerStarted","Data":"540373cae72a34f6a343d7b1e398b3f320cabf4b807436d4fe443ef04b6c84ba"} Apr 22 19:23:51.532569 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:51.532512 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg" event={"ID":"159fb4c8-bbd5-4247-849c-f5639e9543f7","Type":"ContainerStarted","Data":"ac31d248ebf68796d67a08fc74fdfd13a0316347cf9dd66b78a9db8eb1b0ff5b"} Apr 22 19:23:51.554214 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:51.554163 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm4wg" podStartSLOduration=3.754317256 podStartE2EDuration="23.554148072s" podCreationTimestamp="2026-04-22 19:23:28 +0000 UTC" firstStartedPulling="2026-04-22 19:23:31.110639773 +0000 UTC m=+3.230019059" lastFinishedPulling="2026-04-22 19:23:50.910470588 +0000 UTC m=+23.029849875" observedRunningTime="2026-04-22 19:23:51.554099457 +0000 UTC m=+23.673478766" watchObservedRunningTime="2026-04-22 19:23:51.554148072 +0000 UTC m=+23.673527396" Apr 22 19:23:52.425019 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:52.424984 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k48jx" Apr 22 19:23:52.425202 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:52.424984 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bllx4" Apr 22 19:23:52.425202 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:52.425100 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-k48jx" podUID="793d5ba1-c977-4404-bd38-8b78e8e5e191" Apr 22 19:23:52.425202 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:52.424984 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nndbq" Apr 22 19:23:52.425202 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:52.425183 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bllx4" podUID="cbfd8869-819e-45c7-9536-08c72a48f2c3" Apr 22 19:23:52.425380 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:52.425276 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nndbq" podUID="1c461896-346c-4de1-9362-b9f83bd3486d" Apr 22 19:23:53.541459 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:53.540973 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-acl-logging/0.log" Apr 22 19:23:53.546052 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:53.545935 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" event={"ID":"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b","Type":"ContainerStarted","Data":"d633b727e934510c85dbe9fd461957dffd270bbef80cfc0d0597ad3facc35edc"} Apr 22 19:23:53.546865 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:53.546690 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:53.546865 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:53.546847 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:53.547458 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:53.546851 2574 scope.go:117] "RemoveContainer" containerID="bfdd5531a7c2408d8cbbb36161832eb287184e6deaf20a3bd7e65105eb643cc7" Apr 22 19:23:53.565576 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:53.565504 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:53.578291 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:53.578275 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" Apr 22 19:23:54.424593 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:54.424560 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nndbq"
Apr 22 19:23:54.424755 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:54.424560 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bllx4"
Apr 22 19:23:54.424755 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:54.424668 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nndbq" podUID="1c461896-346c-4de1-9362-b9f83bd3486d"
Apr 22 19:23:54.424755 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:54.424561 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k48jx"
Apr 22 19:23:54.424755 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:54.424725 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bllx4" podUID="cbfd8869-819e-45c7-9536-08c72a48f2c3"
Apr 22 19:23:54.424979 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:54.424819 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k48jx" podUID="793d5ba1-c977-4404-bd38-8b78e8e5e191"
Apr 22 19:23:54.550538 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:54.550507 2574 generic.go:358] "Generic (PLEG): container finished" podID="b948da6e-8c3e-4892-92f1-4f59d7c5c885" containerID="56c5df11fe5c990307d05bc405740bd5a1eeb348230ad4686520018c41010212" exitCode=0
Apr 22 19:23:54.550943 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:54.550553 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcctc" event={"ID":"b948da6e-8c3e-4892-92f1-4f59d7c5c885","Type":"ContainerDied","Data":"56c5df11fe5c990307d05bc405740bd5a1eeb348230ad4686520018c41010212"}
Apr 22 19:23:54.553790 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:54.553774 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-acl-logging/0.log"
Apr 22 19:23:54.554111 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:54.554088 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" event={"ID":"5879d8e5-623a-4ec2-9a22-b0b6c0c5917b","Type":"ContainerStarted","Data":"f2a6041ad8499c9cc3091a495e354e02d7d43a69dfd2cdb9763a7b7cad9dcd10"}
Apr 22 19:23:54.554220 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:54.554206 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 19:23:54.600501 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:54.598668 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" podStartSLOduration=9.640182377 podStartE2EDuration="26.598647778s" podCreationTimestamp="2026-04-22 19:23:28 +0000 UTC" firstStartedPulling="2026-04-22 19:23:31.114026238 +0000 UTC m=+3.233405530" lastFinishedPulling="2026-04-22 19:23:48.07249163 +0000 UTC m=+20.191870931" observedRunningTime="2026-04-22 19:23:54.596793787 +0000 UTC m=+26.716173109" watchObservedRunningTime="2026-04-22 19:23:54.598647778 +0000 UTC m=+26.718027088"
Apr 22 19:23:55.446460 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:55.446270 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nndbq"]
Apr 22 19:23:55.446599 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:55.446548 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nndbq"
Apr 22 19:23:55.446669 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:55.446650 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nndbq" podUID="1c461896-346c-4de1-9362-b9f83bd3486d"
Apr 22 19:23:55.446994 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:55.446970 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-k48jx"]
Apr 22 19:23:55.447214 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:55.447103 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k48jx"
Apr 22 19:23:55.447214 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:55.447203 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k48jx" podUID="793d5ba1-c977-4404-bd38-8b78e8e5e191"
Apr 22 19:23:55.447689 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:55.447654 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bllx4"]
Apr 22 19:23:55.447781 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:55.447771 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bllx4"
Apr 22 19:23:55.447910 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:55.447888 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bllx4" podUID="cbfd8869-819e-45c7-9536-08c72a48f2c3"
Apr 22 19:23:55.557423 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:55.557394 2574 generic.go:358] "Generic (PLEG): container finished" podID="b948da6e-8c3e-4892-92f1-4f59d7c5c885" containerID="7eaf1fac246f0e3a49bf6c4c70026dd00379c8a8f4fb8b6d117e3958123731f5" exitCode=0
Apr 22 19:23:55.557821 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:55.557483 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcctc" event={"ID":"b948da6e-8c3e-4892-92f1-4f59d7c5c885","Type":"ContainerDied","Data":"7eaf1fac246f0e3a49bf6c4c70026dd00379c8a8f4fb8b6d117e3958123731f5"}
Apr 22 19:23:55.557821 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:55.557605 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 19:23:56.561447 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:56.561412 2574 generic.go:358] "Generic (PLEG): container finished" podID="b948da6e-8c3e-4892-92f1-4f59d7c5c885"
containerID="5fe5a40bc06a0c3e8018a88aedd4415fc592b197cab5dec70b593a07609b4d22" exitCode=0
Apr 22 19:23:56.561914 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:56.561473 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcctc" event={"ID":"b948da6e-8c3e-4892-92f1-4f59d7c5c885","Type":"ContainerDied","Data":"5fe5a40bc06a0c3e8018a88aedd4415fc592b197cab5dec70b593a07609b4d22"}
Apr 22 19:23:57.425247 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:57.425214 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k48jx"
Apr 22 19:23:57.425433 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:57.425264 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bllx4"
Apr 22 19:23:57.425433 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:57.425272 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nndbq"
Apr 22 19:23:57.425433 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:57.425368 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k48jx" podUID="793d5ba1-c977-4404-bd38-8b78e8e5e191"
Apr 22 19:23:57.425433 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:57.425413 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bllx4" podUID="cbfd8869-819e-45c7-9536-08c72a48f2c3"
Apr 22 19:23:57.425634 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:57.425521 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nndbq" podUID="1c461896-346c-4de1-9362-b9f83bd3486d"
Apr 22 19:23:58.349063 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:58.349026 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/793d5ba1-c977-4404-bd38-8b78e8e5e191-original-pull-secret\") pod \"global-pull-secret-syncer-k48jx\" (UID: \"793d5ba1-c977-4404-bd38-8b78e8e5e191\") " pod="kube-system/global-pull-secret-syncer-k48jx"
Apr 22 19:23:58.349715 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:58.349200 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 19:23:58.349715 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:58.349272 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/793d5ba1-c977-4404-bd38-8b78e8e5e191-original-pull-secret podName:793d5ba1-c977-4404-bd38-8b78e8e5e191 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:14.34925093 +0000 UTC m=+46.468630231 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/793d5ba1-c977-4404-bd38-8b78e8e5e191-original-pull-secret") pod "global-pull-secret-syncer-k48jx" (UID: "793d5ba1-c977-4404-bd38-8b78e8e5e191") : object "kube-system"/"original-pull-secret" not registered
Apr 22 19:23:58.811792 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:58.811749 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:23:58.812030 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:58.812014 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 19:23:58.822767 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:58.822721 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" podUID="5879d8e5-623a-4ec2-9a22-b0b6c0c5917b" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 22 19:23:58.832257 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:58.832223 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl" podUID="5879d8e5-623a-4ec2-9a22-b0b6c0c5917b" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 22 19:23:59.425061 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:59.425029 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k48jx"
Apr 22 19:23:59.425575 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:59.425029 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bllx4"
Apr 22 19:23:59.425575 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:59.425123 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k48jx" podUID="793d5ba1-c977-4404-bd38-8b78e8e5e191"
Apr 22 19:23:59.425575 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:23:59.425035 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nndbq"
Apr 22 19:23:59.425575 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:59.425212 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bllx4" podUID="cbfd8869-819e-45c7-9536-08c72a48f2c3"
Apr 22 19:23:59.425575 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:23:59.425311 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nndbq" podUID="1c461896-346c-4de1-9362-b9f83bd3486d"
Apr 22 19:24:01.221957 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.221887 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-145.ec2.internal" event="NodeReady"
Apr 22 19:24:01.222392 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.222021 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 19:24:01.258383 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.258347 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-67c7f85b4c-gsf88"]
Apr 22 19:24:01.263317 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.263284 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88"
Apr 22 19:24:01.266297 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.266274 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 19:24:01.266776 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.266679 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 19:24:01.266912 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.266858 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 19:24:01.267033 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.267014 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-df49v\""
Apr 22 19:24:01.270784 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.270761 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jb2x5"]
Apr 22 19:24:01.272987 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.272961 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 19:24:01.274393 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.274358 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kn8hz"]
Apr 22 19:24:01.274541 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.274522 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jb2x5"
Apr 22 19:24:01.277391 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.277334 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bspvd\""
Apr 22 19:24:01.277516 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.277498 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 19:24:01.277621 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.277606 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 19:24:01.278248 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.278219 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-67c7f85b4c-gsf88"]
Apr 22 19:24:01.278341 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.278316 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kn8hz"
Apr 22 19:24:01.280677 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.280656 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 19:24:01.280880 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.280816 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 19:24:01.281014 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.280996 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 19:24:01.281358 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.281340 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wknft\""
Apr 22 19:24:01.285032 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.285010 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jb2x5"]
Apr 22 19:24:01.286981 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.286959 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kn8hz"]
Apr 22 19:24:01.372240 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.372205 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e544a780-e5f6-409b-b57c-0e80f0766bb1-tmp-dir\") pod \"dns-default-jb2x5\" (UID: \"e544a780-e5f6-409b-b57c-0e80f0766bb1\") " pod="openshift-dns/dns-default-jb2x5"
Apr 22 19:24:01.372421 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.372251 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxrmt\" (UniqueName: \"kubernetes.io/projected/33fcaecf-093b-4a6b-9bed-0310951d4825-kube-api-access-kxrmt\") pod \"ingress-canary-kn8hz\" (UID: \"33fcaecf-093b-4a6b-9bed-0310951d4825\") " pod="openshift-ingress-canary/ingress-canary-kn8hz"
Apr 22 19:24:01.372421 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.372303 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgg6n\" (UniqueName: \"kubernetes.io/projected/e544a780-e5f6-409b-b57c-0e80f0766bb1-kube-api-access-bgg6n\") pod \"dns-default-jb2x5\" (UID: \"e544a780-e5f6-409b-b57c-0e80f0766bb1\") " pod="openshift-dns/dns-default-jb2x5"
Apr 22 19:24:01.372421 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.372335 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fd234ae5-2ef2-482b-9874-6902bc15a04a-image-registry-private-configuration\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88"
Apr 22 19:24:01.372421 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.372365 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p9wm\" (UniqueName: \"kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-kube-api-access-9p9wm\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88"
Apr 22 19:24:01.372421 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.372399 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e544a780-e5f6-409b-b57c-0e80f0766bb1-metrics-tls\") pod \"dns-default-jb2x5\" (UID: \"e544a780-e5f6-409b-b57c-0e80f0766bb1\") " pod="openshift-dns/dns-default-jb2x5"
Apr 22 19:24:01.372630 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.372457 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33fcaecf-093b-4a6b-9bed-0310951d4825-cert\") pod \"ingress-canary-kn8hz\" (UID: \"33fcaecf-093b-4a6b-9bed-0310951d4825\") " pod="openshift-ingress-canary/ingress-canary-kn8hz"
Apr 22 19:24:01.372630 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.372514 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fd234ae5-2ef2-482b-9874-6902bc15a04a-installation-pull-secrets\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88"
Apr 22 19:24:01.372630 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.372540 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e544a780-e5f6-409b-b57c-0e80f0766bb1-config-volume\") pod \"dns-default-jb2x5\" (UID: \"e544a780-e5f6-409b-b57c-0e80f0766bb1\") " pod="openshift-dns/dns-default-jb2x5"
Apr 22 19:24:01.372630 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.372572 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd234ae5-2ef2-482b-9874-6902bc15a04a-trusted-ca\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88"
Apr 22 19:24:01.372630 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.372603 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fd234ae5-2ef2-482b-9874-6902bc15a04a-ca-trust-extracted\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88"
Apr 22 19:24:01.372875 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.372660 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-tls\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88"
Apr 22 19:24:01.372875 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.372698 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-bound-sa-token\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88"
Apr 22 19:24:01.372875 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.372747 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-certificates\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88"
Apr 22 19:24:01.424779 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.424740 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k48jx"
Apr 22 19:24:01.424963 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.424785 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nndbq"
Apr 22 19:24:01.425028 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.424972 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bllx4"
Apr 22 19:24:01.427606 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.427574 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 19:24:01.427606 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.427584 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7c2z6\""
Apr 22 19:24:01.427841 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.427623 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 19:24:01.427841 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.427706 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tqt8b\""
Apr 22 19:24:01.428025 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.428007 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 19:24:01.428098 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.428007 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 19:24:01.473385 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.473322 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33fcaecf-093b-4a6b-9bed-0310951d4825-cert\") pod \"ingress-canary-kn8hz\" (UID: \"33fcaecf-093b-4a6b-9bed-0310951d4825\") " pod="openshift-ingress-canary/ingress-canary-kn8hz"
Apr 22 19:24:01.473385 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.473370 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fd234ae5-2ef2-482b-9874-6902bc15a04a-installation-pull-secrets\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88"
Apr 22 19:24:01.473555 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.473397 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e544a780-e5f6-409b-b57c-0e80f0766bb1-config-volume\") pod \"dns-default-jb2x5\" (UID: \"e544a780-e5f6-409b-b57c-0e80f0766bb1\") " pod="openshift-dns/dns-default-jb2x5"
Apr 22 19:24:01.473555 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.473431 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd234ae5-2ef2-482b-9874-6902bc15a04a-trusted-ca\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88"
Apr 22 19:24:01.473555 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:01.473482 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:24:01.473658 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.473553 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fd234ae5-2ef2-482b-9874-6902bc15a04a-ca-trust-extracted\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88"
Apr 22 19:24:01.474249 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:01.474226 2574
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33fcaecf-093b-4a6b-9bed-0310951d4825-cert podName:33fcaecf-093b-4a6b-9bed-0310951d4825 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:01.974191144 +0000 UTC m=+34.093570440 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33fcaecf-093b-4a6b-9bed-0310951d4825-cert") pod "ingress-canary-kn8hz" (UID: "33fcaecf-093b-4a6b-9bed-0310951d4825") : secret "canary-serving-cert" not found
Apr 22 19:24:01.474391 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.474274 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-tls\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88"
Apr 22 19:24:01.474391 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.474320 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-bound-sa-token\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88"
Apr 22 19:24:01.474391 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.474372 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-certificates\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88"
Apr 22 19:24:01.474549 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.474410 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e544a780-e5f6-409b-b57c-0e80f0766bb1-tmp-dir\") pod \"dns-default-jb2x5\" (UID: \"e544a780-e5f6-409b-b57c-0e80f0766bb1\") " pod="openshift-dns/dns-default-jb2x5"
Apr 22 19:24:01.474549 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.474448 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxrmt\" (UniqueName: \"kubernetes.io/projected/33fcaecf-093b-4a6b-9bed-0310951d4825-kube-api-access-kxrmt\") pod \"ingress-canary-kn8hz\" (UID: \"33fcaecf-093b-4a6b-9bed-0310951d4825\") " pod="openshift-ingress-canary/ingress-canary-kn8hz"
Apr 22 19:24:01.474549 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.474480 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgg6n\" (UniqueName: \"kubernetes.io/projected/e544a780-e5f6-409b-b57c-0e80f0766bb1-kube-api-access-bgg6n\") pod \"dns-default-jb2x5\" (UID: \"e544a780-e5f6-409b-b57c-0e80f0766bb1\") " pod="openshift-dns/dns-default-jb2x5"
Apr 22 19:24:01.474549 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.474524 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fd234ae5-2ef2-482b-9874-6902bc15a04a-image-registry-private-configuration\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88"
Apr 22 19:24:01.474780 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.474561 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9p9wm\" (UniqueName: \"kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-kube-api-access-9p9wm\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88"
Apr 22 19:24:01.474780 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.474597 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e544a780-e5f6-409b-b57c-0e80f0766bb1-metrics-tls\") pod \"dns-default-jb2x5\" (UID: \"e544a780-e5f6-409b-b57c-0e80f0766bb1\") " pod="openshift-dns/dns-default-jb2x5"
Apr 22 19:24:01.475614 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:01.474958 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:24:01.475614 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:01.475020 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e544a780-e5f6-409b-b57c-0e80f0766bb1-metrics-tls podName:e544a780-e5f6-409b-b57c-0e80f0766bb1 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:01.975003366 +0000 UTC m=+34.094382666 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e544a780-e5f6-409b-b57c-0e80f0766bb1-metrics-tls") pod "dns-default-jb2x5" (UID: "e544a780-e5f6-409b-b57c-0e80f0766bb1") : secret "dns-default-metrics-tls" not found
Apr 22 19:24:01.475614 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.475127 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e544a780-e5f6-409b-b57c-0e80f0766bb1-tmp-dir\") pod \"dns-default-jb2x5\" (UID: \"e544a780-e5f6-409b-b57c-0e80f0766bb1\") " pod="openshift-dns/dns-default-jb2x5"
Apr 22 19:24:01.476060 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:01.475880 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:24:01.476060 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:01.475898 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67c7f85b4c-gsf88: secret "image-registry-tls" not found
Apr 22 19:24:01.476060 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.476004 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e544a780-e5f6-409b-b57c-0e80f0766bb1-config-volume\") pod \"dns-default-jb2x5\" (UID: \"e544a780-e5f6-409b-b57c-0e80f0766bb1\") " pod="openshift-dns/dns-default-jb2x5"
Apr 22 19:24:01.476242 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:01.476074 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-tls podName:fd234ae5-2ef2-482b-9874-6902bc15a04a nodeName:}" failed. No retries permitted until 2026-04-22 19:24:01.97605523 +0000 UTC m=+34.095434517 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-tls") pod "image-registry-67c7f85b4c-gsf88" (UID: "fd234ae5-2ef2-482b-9874-6902bc15a04a") : secret "image-registry-tls" not found
Apr 22 19:24:01.478128 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.476467 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fd234ae5-2ef2-482b-9874-6902bc15a04a-ca-trust-extracted\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88"
Apr 22 19:24:01.478128 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.477050 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-certificates\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88"
Apr 22 19:24:01.478128 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.477258 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd234ae5-2ef2-482b-9874-6902bc15a04a-trusted-ca\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88"
Apr 22 19:24:01.479852 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.479669 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fd234ae5-2ef2-482b-9874-6902bc15a04a-image-registry-private-configuration\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88"
Apr 22 19:24:01.479931 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.479866 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fd234ae5-2ef2-482b-9874-6902bc15a04a-installation-pull-secrets\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88"
Apr 22 19:24:01.486955 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.486910 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgg6n\" (UniqueName: \"kubernetes.io/projected/e544a780-e5f6-409b-b57c-0e80f0766bb1-kube-api-access-bgg6n\") pod \"dns-default-jb2x5\" (UID: \"e544a780-e5f6-409b-b57c-0e80f0766bb1\") " pod="openshift-dns/dns-default-jb2x5"
Apr 22 19:24:01.487279 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.487250 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxrmt\" (UniqueName: \"kubernetes.io/projected/33fcaecf-093b-4a6b-9bed-0310951d4825-kube-api-access-kxrmt\") pod \"ingress-canary-kn8hz\" (UID: \"33fcaecf-093b-4a6b-9bed-0310951d4825\") " pod="openshift-ingress-canary/ingress-canary-kn8hz"
Apr 22 19:24:01.487410 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.487390 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p9wm\" (UniqueName: \"kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-kube-api-access-9p9wm\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88"
Apr 22 19:24:01.488123 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.488104 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-bound-sa-token\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88"
Apr 22 19:24:01.583547 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.583514 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-d2sdz"]
Apr 22 19:24:01.588068 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.588047 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d2sdz" Apr 22 19:24:01.590670 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.590647 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 19:24:01.590951 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.590837 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-5b8rs\"" Apr 22 19:24:01.590951 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.590859 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 19:24:01.597626 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.597587 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-d2sdz"] Apr 22 19:24:01.676455 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.676421 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbtw2\" (UniqueName: \"kubernetes.io/projected/80f993c4-91be-4779-90b0-4d41d6f29f4e-kube-api-access-zbtw2\") pod \"migrator-74bb7799d9-d2sdz\" (UID: \"80f993c4-91be-4779-90b0-4d41d6f29f4e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d2sdz" Apr 22 19:24:01.777282 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.777242 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zbtw2\" (UniqueName: \"kubernetes.io/projected/80f993c4-91be-4779-90b0-4d41d6f29f4e-kube-api-access-zbtw2\") pod \"migrator-74bb7799d9-d2sdz\" (UID: \"80f993c4-91be-4779-90b0-4d41d6f29f4e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d2sdz" Apr 22 19:24:01.789320 ip-10-0-129-145 
kubenswrapper[2574]: I0422 19:24:01.789297 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbtw2\" (UniqueName: \"kubernetes.io/projected/80f993c4-91be-4779-90b0-4d41d6f29f4e-kube-api-access-zbtw2\") pod \"migrator-74bb7799d9-d2sdz\" (UID: \"80f993c4-91be-4779-90b0-4d41d6f29f4e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d2sdz" Apr 22 19:24:01.899027 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.898990 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d2sdz" Apr 22 19:24:01.978177 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.978132 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e544a780-e5f6-409b-b57c-0e80f0766bb1-metrics-tls\") pod \"dns-default-jb2x5\" (UID: \"e544a780-e5f6-409b-b57c-0e80f0766bb1\") " pod="openshift-dns/dns-default-jb2x5" Apr 22 19:24:01.978354 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.978215 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33fcaecf-093b-4a6b-9bed-0310951d4825-cert\") pod \"ingress-canary-kn8hz\" (UID: \"33fcaecf-093b-4a6b-9bed-0310951d4825\") " pod="openshift-ingress-canary/ingress-canary-kn8hz" Apr 22 19:24:01.978354 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:01.978274 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:01.978354 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:01.978330 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:01.978354 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:01.978282 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-tls\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88" Apr 22 19:24:01.978354 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:01.978352 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e544a780-e5f6-409b-b57c-0e80f0766bb1-metrics-tls podName:e544a780-e5f6-409b-b57c-0e80f0766bb1 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:02.978335289 +0000 UTC m=+35.097714576 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e544a780-e5f6-409b-b57c-0e80f0766bb1-metrics-tls") pod "dns-default-jb2x5" (UID: "e544a780-e5f6-409b-b57c-0e80f0766bb1") : secret "dns-default-metrics-tls" not found Apr 22 19:24:01.978354 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:01.978355 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:24:01.978647 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:01.978370 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67c7f85b4c-gsf88: secret "image-registry-tls" not found Apr 22 19:24:01.978647 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:01.978397 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33fcaecf-093b-4a6b-9bed-0310951d4825-cert podName:33fcaecf-093b-4a6b-9bed-0310951d4825 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:02.978383747 +0000 UTC m=+35.097763039 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33fcaecf-093b-4a6b-9bed-0310951d4825-cert") pod "ingress-canary-kn8hz" (UID: "33fcaecf-093b-4a6b-9bed-0310951d4825") : secret "canary-serving-cert" not found Apr 22 19:24:01.978647 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:01.978423 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-tls podName:fd234ae5-2ef2-482b-9874-6902bc15a04a nodeName:}" failed. No retries permitted until 2026-04-22 19:24:02.978409256 +0000 UTC m=+35.097788550 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-tls") pod "image-registry-67c7f85b4c-gsf88" (UID: "fd234ae5-2ef2-482b-9874-6902bc15a04a") : secret "image-registry-tls" not found Apr 22 19:24:02.176024 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:02.175992 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-d2sdz"] Apr 22 19:24:02.180295 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:02.180099 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c461896-346c-4de1-9362-b9f83bd3486d-metrics-certs\") pod \"network-metrics-daemon-nndbq\" (UID: \"1c461896-346c-4de1-9362-b9f83bd3486d\") " pod="openshift-multus/network-metrics-daemon-nndbq" Apr 22 19:24:02.180295 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:02.180286 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:24:02.180475 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:02.180365 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c461896-346c-4de1-9362-b9f83bd3486d-metrics-certs podName:1c461896-346c-4de1-9362-b9f83bd3486d 
nodeName:}" failed. No retries permitted until 2026-04-22 19:24:34.18034481 +0000 UTC m=+66.299724097 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c461896-346c-4de1-9362-b9f83bd3486d-metrics-certs") pod "network-metrics-daemon-nndbq" (UID: "1c461896-346c-4de1-9362-b9f83bd3486d") : secret "metrics-daemon-secret" not found Apr 22 19:24:02.180475 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:24:02.180384 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80f993c4_91be_4779_90b0_4d41d6f29f4e.slice/crio-d801a48000e3f588c0c6f83437a602ededf4346dd47850eb40615842f69b487b WatchSource:0}: Error finding container d801a48000e3f588c0c6f83437a602ededf4346dd47850eb40615842f69b487b: Status 404 returned error can't find the container with id d801a48000e3f588c0c6f83437a602ededf4346dd47850eb40615842f69b487b Apr 22 19:24:02.215229 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:02.215211 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cpzbb_b1c0b054-80a0-4cc5-b053-a4d99268aa8f/dns-node-resolver/0.log" Apr 22 19:24:02.280903 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:02.280877 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fqcc\" (UniqueName: \"kubernetes.io/projected/cbfd8869-819e-45c7-9536-08c72a48f2c3-kube-api-access-7fqcc\") pod \"network-check-target-bllx4\" (UID: \"cbfd8869-819e-45c7-9536-08c72a48f2c3\") " pod="openshift-network-diagnostics/network-check-target-bllx4" Apr 22 19:24:02.284136 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:02.284113 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fqcc\" (UniqueName: \"kubernetes.io/projected/cbfd8869-819e-45c7-9536-08c72a48f2c3-kube-api-access-7fqcc\") pod \"network-check-target-bllx4\" (UID: 
\"cbfd8869-819e-45c7-9536-08c72a48f2c3\") " pod="openshift-network-diagnostics/network-check-target-bllx4" Apr 22 19:24:02.351409 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:02.351339 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bllx4" Apr 22 19:24:02.467600 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:02.467568 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bllx4"] Apr 22 19:24:02.470848 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:24:02.470797 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbfd8869_819e_45c7_9536_08c72a48f2c3.slice/crio-33e6baa516fdbe292715bfb2480248ae5e642ef0944a60694e20be7298c26f79 WatchSource:0}: Error finding container 33e6baa516fdbe292715bfb2480248ae5e642ef0944a60694e20be7298c26f79: Status 404 returned error can't find the container with id 33e6baa516fdbe292715bfb2480248ae5e642ef0944a60694e20be7298c26f79 Apr 22 19:24:02.576108 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:02.575919 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bllx4" event={"ID":"cbfd8869-819e-45c7-9536-08c72a48f2c3","Type":"ContainerStarted","Data":"33e6baa516fdbe292715bfb2480248ae5e642ef0944a60694e20be7298c26f79"} Apr 22 19:24:02.576848 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:02.576824 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d2sdz" event={"ID":"80f993c4-91be-4779-90b0-4d41d6f29f4e","Type":"ContainerStarted","Data":"d801a48000e3f588c0c6f83437a602ededf4346dd47850eb40615842f69b487b"} Apr 22 19:24:02.579280 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:02.579253 2574 generic.go:358] "Generic (PLEG): container finished" podID="b948da6e-8c3e-4892-92f1-4f59d7c5c885" 
containerID="43ef4025ba7c82b3d3f72e2f4e885d6d1f3fb80195de3388a0bc47cffd3989be" exitCode=0 Apr 22 19:24:02.579373 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:02.579300 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcctc" event={"ID":"b948da6e-8c3e-4892-92f1-4f59d7c5c885","Type":"ContainerDied","Data":"43ef4025ba7c82b3d3f72e2f4e885d6d1f3fb80195de3388a0bc47cffd3989be"} Apr 22 19:24:02.797122 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:02.797100 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jkcf2_91147599-bdf0-49f5-98ed-a3567eaf56db/node-ca/0.log" Apr 22 19:24:02.986849 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:02.986767 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e544a780-e5f6-409b-b57c-0e80f0766bb1-metrics-tls\") pod \"dns-default-jb2x5\" (UID: \"e544a780-e5f6-409b-b57c-0e80f0766bb1\") " pod="openshift-dns/dns-default-jb2x5" Apr 22 19:24:02.986979 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:02.986865 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33fcaecf-093b-4a6b-9bed-0310951d4825-cert\") pod \"ingress-canary-kn8hz\" (UID: \"33fcaecf-093b-4a6b-9bed-0310951d4825\") " pod="openshift-ingress-canary/ingress-canary-kn8hz" Apr 22 19:24:02.986979 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:02.986919 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:02.987048 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:02.986982 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e544a780-e5f6-409b-b57c-0e80f0766bb1-metrics-tls podName:e544a780-e5f6-409b-b57c-0e80f0766bb1 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:24:04.986964434 +0000 UTC m=+37.106343742 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e544a780-e5f6-409b-b57c-0e80f0766bb1-metrics-tls") pod "dns-default-jb2x5" (UID: "e544a780-e5f6-409b-b57c-0e80f0766bb1") : secret "dns-default-metrics-tls" not found Apr 22 19:24:02.987048 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:02.986984 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:24:02.987048 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:02.986998 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67c7f85b4c-gsf88: secret "image-registry-tls" not found Apr 22 19:24:02.987048 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:02.986919 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-tls\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88" Apr 22 19:24:02.987048 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:02.986996 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:02.987048 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:02.987030 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-tls podName:fd234ae5-2ef2-482b-9874-6902bc15a04a nodeName:}" failed. No retries permitted until 2026-04-22 19:24:04.987020128 +0000 UTC m=+37.106399416 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-tls") pod "image-registry-67c7f85b4c-gsf88" (UID: "fd234ae5-2ef2-482b-9874-6902bc15a04a") : secret "image-registry-tls" not found Apr 22 19:24:02.987228 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:02.987062 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33fcaecf-093b-4a6b-9bed-0310951d4825-cert podName:33fcaecf-093b-4a6b-9bed-0310951d4825 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:04.987049034 +0000 UTC m=+37.106428325 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33fcaecf-093b-4a6b-9bed-0310951d4825-cert") pod "ingress-canary-kn8hz" (UID: "33fcaecf-093b-4a6b-9bed-0310951d4825") : secret "canary-serving-cert" not found Apr 22 19:24:03.213331 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:03.213300 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-c5cb4"] Apr 22 19:24:03.236237 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:03.236208 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-c5cb4"] Apr 22 19:24:03.236385 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:03.236335 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-c5cb4" Apr 22 19:24:03.239133 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:03.239075 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 22 19:24:03.239341 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:03.239320 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 22 19:24:03.240379 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:03.240359 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-p5mzx\"" Apr 22 19:24:03.240491 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:03.240453 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 22 19:24:03.240576 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:03.240560 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 22 19:24:03.390760 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:03.390722 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4425d89c-2d75-40bb-90fd-74877683a094-signing-key\") pod \"service-ca-865cb79987-c5cb4\" (UID: \"4425d89c-2d75-40bb-90fd-74877683a094\") " pod="openshift-service-ca/service-ca-865cb79987-c5cb4" Apr 22 19:24:03.390760 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:03.390755 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4425d89c-2d75-40bb-90fd-74877683a094-signing-cabundle\") pod \"service-ca-865cb79987-c5cb4\" (UID: \"4425d89c-2d75-40bb-90fd-74877683a094\") " 
pod="openshift-service-ca/service-ca-865cb79987-c5cb4" Apr 22 19:24:03.391194 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:03.390847 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pswpb\" (UniqueName: \"kubernetes.io/projected/4425d89c-2d75-40bb-90fd-74877683a094-kube-api-access-pswpb\") pod \"service-ca-865cb79987-c5cb4\" (UID: \"4425d89c-2d75-40bb-90fd-74877683a094\") " pod="openshift-service-ca/service-ca-865cb79987-c5cb4" Apr 22 19:24:03.491323 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:03.491233 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pswpb\" (UniqueName: \"kubernetes.io/projected/4425d89c-2d75-40bb-90fd-74877683a094-kube-api-access-pswpb\") pod \"service-ca-865cb79987-c5cb4\" (UID: \"4425d89c-2d75-40bb-90fd-74877683a094\") " pod="openshift-service-ca/service-ca-865cb79987-c5cb4" Apr 22 19:24:03.491472 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:03.491337 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4425d89c-2d75-40bb-90fd-74877683a094-signing-key\") pod \"service-ca-865cb79987-c5cb4\" (UID: \"4425d89c-2d75-40bb-90fd-74877683a094\") " pod="openshift-service-ca/service-ca-865cb79987-c5cb4" Apr 22 19:24:03.491472 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:03.491371 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4425d89c-2d75-40bb-90fd-74877683a094-signing-cabundle\") pod \"service-ca-865cb79987-c5cb4\" (UID: \"4425d89c-2d75-40bb-90fd-74877683a094\") " pod="openshift-service-ca/service-ca-865cb79987-c5cb4" Apr 22 19:24:03.492561 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:03.492534 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/4425d89c-2d75-40bb-90fd-74877683a094-signing-cabundle\") pod \"service-ca-865cb79987-c5cb4\" (UID: \"4425d89c-2d75-40bb-90fd-74877683a094\") " pod="openshift-service-ca/service-ca-865cb79987-c5cb4" Apr 22 19:24:03.496999 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:03.496895 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4425d89c-2d75-40bb-90fd-74877683a094-signing-key\") pod \"service-ca-865cb79987-c5cb4\" (UID: \"4425d89c-2d75-40bb-90fd-74877683a094\") " pod="openshift-service-ca/service-ca-865cb79987-c5cb4" Apr 22 19:24:03.503161 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:03.503136 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pswpb\" (UniqueName: \"kubernetes.io/projected/4425d89c-2d75-40bb-90fd-74877683a094-kube-api-access-pswpb\") pod \"service-ca-865cb79987-c5cb4\" (UID: \"4425d89c-2d75-40bb-90fd-74877683a094\") " pod="openshift-service-ca/service-ca-865cb79987-c5cb4" Apr 22 19:24:03.549004 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:03.548968 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-c5cb4" Apr 22 19:24:03.584558 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:03.584513 2574 generic.go:358] "Generic (PLEG): container finished" podID="b948da6e-8c3e-4892-92f1-4f59d7c5c885" containerID="493b80d1546d02bd7b5641d031663df69c73458dd337c71836d07c43a166452e" exitCode=0 Apr 22 19:24:03.584706 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:03.584590 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcctc" event={"ID":"b948da6e-8c3e-4892-92f1-4f59d7c5c885","Type":"ContainerDied","Data":"493b80d1546d02bd7b5641d031663df69c73458dd337c71836d07c43a166452e"} Apr 22 19:24:03.955700 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:03.955651 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-c5cb4"] Apr 22 19:24:04.023832 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:24:04.023782 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4425d89c_2d75_40bb_90fd_74877683a094.slice/crio-267d0f102c668eadf80a1043a0aca6e8c5316637a0effbd1211dea0daf4eb19c WatchSource:0}: Error finding container 267d0f102c668eadf80a1043a0aca6e8c5316637a0effbd1211dea0daf4eb19c: Status 404 returned error can't find the container with id 267d0f102c668eadf80a1043a0aca6e8c5316637a0effbd1211dea0daf4eb19c Apr 22 19:24:04.589964 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:04.589222 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d2sdz" event={"ID":"80f993c4-91be-4779-90b0-4d41d6f29f4e","Type":"ContainerStarted","Data":"81ae9c1b75b475fcf634d2f811ab14f77538238595fc568bc46091c3c0502cd8"} Apr 22 19:24:04.589964 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:04.589503 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d2sdz" event={"ID":"80f993c4-91be-4779-90b0-4d41d6f29f4e","Type":"ContainerStarted","Data":"dce2222df31b649187f50530a2062f737021e83644943f4720dec165623669a4"} Apr 22 19:24:04.590938 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:04.590907 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-c5cb4" event={"ID":"4425d89c-2d75-40bb-90fd-74877683a094","Type":"ContainerStarted","Data":"267d0f102c668eadf80a1043a0aca6e8c5316637a0effbd1211dea0daf4eb19c"} Apr 22 19:24:04.595139 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:04.595080 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcctc" event={"ID":"b948da6e-8c3e-4892-92f1-4f59d7c5c885","Type":"ContainerStarted","Data":"2ec0a4acf3df468c60c0fc3ec9c7178755aca588f145ddd74725202b82f3f34e"} Apr 22 19:24:04.612438 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:04.612374 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d2sdz" podStartSLOduration=1.756460423 podStartE2EDuration="3.612354181s" podCreationTimestamp="2026-04-22 19:24:01 +0000 UTC" firstStartedPulling="2026-04-22 19:24:02.182219186 +0000 UTC m=+34.301598477" lastFinishedPulling="2026-04-22 19:24:04.038112933 +0000 UTC m=+36.157492235" observedRunningTime="2026-04-22 19:24:04.611919431 +0000 UTC m=+36.731298741" watchObservedRunningTime="2026-04-22 19:24:04.612354181 +0000 UTC m=+36.731733490" Apr 22 19:24:04.643706 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:04.643454 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-dcctc" podStartSLOduration=5.671088955 podStartE2EDuration="36.64342097s" podCreationTimestamp="2026-04-22 19:23:28 +0000 UTC" firstStartedPulling="2026-04-22 19:23:31.122908576 +0000 UTC m=+3.242287872" 
lastFinishedPulling="2026-04-22 19:24:02.095240597 +0000 UTC m=+34.214619887" observedRunningTime="2026-04-22 19:24:04.64311665 +0000 UTC m=+36.762495960" watchObservedRunningTime="2026-04-22 19:24:04.64342097 +0000 UTC m=+36.762800279" Apr 22 19:24:05.004182 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:05.004131 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33fcaecf-093b-4a6b-9bed-0310951d4825-cert\") pod \"ingress-canary-kn8hz\" (UID: \"33fcaecf-093b-4a6b-9bed-0310951d4825\") " pod="openshift-ingress-canary/ingress-canary-kn8hz" Apr 22 19:24:05.004401 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:05.004217 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-tls\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88" Apr 22 19:24:05.004401 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:05.004272 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e544a780-e5f6-409b-b57c-0e80f0766bb1-metrics-tls\") pod \"dns-default-jb2x5\" (UID: \"e544a780-e5f6-409b-b57c-0e80f0766bb1\") " pod="openshift-dns/dns-default-jb2x5" Apr 22 19:24:05.004401 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:05.004306 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:05.004401 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:05.004380 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33fcaecf-093b-4a6b-9bed-0310951d4825-cert podName:33fcaecf-093b-4a6b-9bed-0310951d4825 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:24:09.004357055 +0000 UTC m=+41.123736370 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33fcaecf-093b-4a6b-9bed-0310951d4825-cert") pod "ingress-canary-kn8hz" (UID: "33fcaecf-093b-4a6b-9bed-0310951d4825") : secret "canary-serving-cert" not found Apr 22 19:24:05.004401 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:05.004398 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:05.004709 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:05.004442 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e544a780-e5f6-409b-b57c-0e80f0766bb1-metrics-tls podName:e544a780-e5f6-409b-b57c-0e80f0766bb1 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:09.004429295 +0000 UTC m=+41.123808582 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e544a780-e5f6-409b-b57c-0e80f0766bb1-metrics-tls") pod "dns-default-jb2x5" (UID: "e544a780-e5f6-409b-b57c-0e80f0766bb1") : secret "dns-default-metrics-tls" not found Apr 22 19:24:05.004709 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:05.004507 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:24:05.004709 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:05.004519 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67c7f85b4c-gsf88: secret "image-registry-tls" not found Apr 22 19:24:05.004709 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:05.004551 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-tls podName:fd234ae5-2ef2-482b-9874-6902bc15a04a nodeName:}" failed. 
No retries permitted until 2026-04-22 19:24:09.004540193 +0000 UTC m=+41.123919482 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-tls") pod "image-registry-67c7f85b4c-gsf88" (UID: "fd234ae5-2ef2-482b-9874-6902bc15a04a") : secret "image-registry-tls" not found Apr 22 19:24:07.602141 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:07.602107 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bllx4" event={"ID":"cbfd8869-819e-45c7-9536-08c72a48f2c3","Type":"ContainerStarted","Data":"7f1b9c2cac6e1ecb742550af6b1b478f2fe2f96c0f0e0c684b21b2bdd5bd711e"} Apr 22 19:24:07.602604 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:07.602214 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-bllx4" Apr 22 19:24:07.603473 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:07.603450 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-c5cb4" event={"ID":"4425d89c-2d75-40bb-90fd-74877683a094","Type":"ContainerStarted","Data":"f19e8d08eefbe48874513a0a20980ed660954e68cbdf8fd40689d5d552f5978e"} Apr 22 19:24:07.619002 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:07.618954 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-bllx4" podStartSLOduration=35.534711475 podStartE2EDuration="39.618942966s" podCreationTimestamp="2026-04-22 19:23:28 +0000 UTC" firstStartedPulling="2026-04-22 19:24:02.472743517 +0000 UTC m=+34.592122804" lastFinishedPulling="2026-04-22 19:24:06.556974994 +0000 UTC m=+38.676354295" observedRunningTime="2026-04-22 19:24:07.618761194 +0000 UTC m=+39.738140514" watchObservedRunningTime="2026-04-22 19:24:07.618942966 +0000 UTC m=+39.738322275" Apr 22 19:24:07.641169 ip-10-0-129-145 
kubenswrapper[2574]: I0422 19:24:07.641125 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-c5cb4" podStartSLOduration=2.111568791 podStartE2EDuration="4.641112784s" podCreationTimestamp="2026-04-22 19:24:03 +0000 UTC" firstStartedPulling="2026-04-22 19:24:04.03261072 +0000 UTC m=+36.151990007" lastFinishedPulling="2026-04-22 19:24:06.562154708 +0000 UTC m=+38.681534000" observedRunningTime="2026-04-22 19:24:07.64042296 +0000 UTC m=+39.759802325" watchObservedRunningTime="2026-04-22 19:24:07.641112784 +0000 UTC m=+39.760492093" Apr 22 19:24:09.036959 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:09.036918 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-tls\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88" Apr 22 19:24:09.037405 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:09.036994 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e544a780-e5f6-409b-b57c-0e80f0766bb1-metrics-tls\") pod \"dns-default-jb2x5\" (UID: \"e544a780-e5f6-409b-b57c-0e80f0766bb1\") " pod="openshift-dns/dns-default-jb2x5" Apr 22 19:24:09.037405 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:09.037050 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33fcaecf-093b-4a6b-9bed-0310951d4825-cert\") pod \"ingress-canary-kn8hz\" (UID: \"33fcaecf-093b-4a6b-9bed-0310951d4825\") " pod="openshift-ingress-canary/ingress-canary-kn8hz" Apr 22 19:24:09.037405 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:09.037099 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not 
found Apr 22 19:24:09.037405 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:09.037127 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67c7f85b4c-gsf88: secret "image-registry-tls" not found Apr 22 19:24:09.037405 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:09.037149 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:09.037405 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:09.037157 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:09.037405 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:09.037193 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-tls podName:fd234ae5-2ef2-482b-9874-6902bc15a04a nodeName:}" failed. No retries permitted until 2026-04-22 19:24:17.03717121 +0000 UTC m=+49.156550502 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-tls") pod "image-registry-67c7f85b4c-gsf88" (UID: "fd234ae5-2ef2-482b-9874-6902bc15a04a") : secret "image-registry-tls" not found Apr 22 19:24:09.037405 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:09.037215 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33fcaecf-093b-4a6b-9bed-0310951d4825-cert podName:33fcaecf-093b-4a6b-9bed-0310951d4825 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:17.037203195 +0000 UTC m=+49.156582490 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33fcaecf-093b-4a6b-9bed-0310951d4825-cert") pod "ingress-canary-kn8hz" (UID: "33fcaecf-093b-4a6b-9bed-0310951d4825") : secret "canary-serving-cert" not found Apr 22 19:24:09.037405 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:09.037232 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e544a780-e5f6-409b-b57c-0e80f0766bb1-metrics-tls podName:e544a780-e5f6-409b-b57c-0e80f0766bb1 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:17.037223334 +0000 UTC m=+49.156602625 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e544a780-e5f6-409b-b57c-0e80f0766bb1-metrics-tls") pod "dns-default-jb2x5" (UID: "e544a780-e5f6-409b-b57c-0e80f0766bb1") : secret "dns-default-metrics-tls" not found Apr 22 19:24:14.373980 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:14.373944 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/793d5ba1-c977-4404-bd38-8b78e8e5e191-original-pull-secret\") pod \"global-pull-secret-syncer-k48jx\" (UID: \"793d5ba1-c977-4404-bd38-8b78e8e5e191\") " pod="kube-system/global-pull-secret-syncer-k48jx" Apr 22 19:24:14.377245 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:14.377222 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/793d5ba1-c977-4404-bd38-8b78e8e5e191-original-pull-secret\") pod \"global-pull-secret-syncer-k48jx\" (UID: \"793d5ba1-c977-4404-bd38-8b78e8e5e191\") " pod="kube-system/global-pull-secret-syncer-k48jx" Apr 22 19:24:14.638269 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:14.638182 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-k48jx" Apr 22 19:24:14.755541 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:14.755505 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-k48jx"] Apr 22 19:24:14.758339 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:24:14.758309 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod793d5ba1_c977_4404_bd38_8b78e8e5e191.slice/crio-3d19d1cafa653886ccfab95f76204764e559d15734975b3390ba965e6d939c71 WatchSource:0}: Error finding container 3d19d1cafa653886ccfab95f76204764e559d15734975b3390ba965e6d939c71: Status 404 returned error can't find the container with id 3d19d1cafa653886ccfab95f76204764e559d15734975b3390ba965e6d939c71 Apr 22 19:24:15.620211 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:15.620171 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-k48jx" event={"ID":"793d5ba1-c977-4404-bd38-8b78e8e5e191","Type":"ContainerStarted","Data":"3d19d1cafa653886ccfab95f76204764e559d15734975b3390ba965e6d939c71"} Apr 22 19:24:17.098655 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:17.098617 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-tls\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88" Apr 22 19:24:17.099324 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:17.098677 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e544a780-e5f6-409b-b57c-0e80f0766bb1-metrics-tls\") pod \"dns-default-jb2x5\" (UID: \"e544a780-e5f6-409b-b57c-0e80f0766bb1\") " pod="openshift-dns/dns-default-jb2x5" Apr 22 19:24:17.099324 
ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:17.098709 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33fcaecf-093b-4a6b-9bed-0310951d4825-cert\") pod \"ingress-canary-kn8hz\" (UID: \"33fcaecf-093b-4a6b-9bed-0310951d4825\") " pod="openshift-ingress-canary/ingress-canary-kn8hz" Apr 22 19:24:17.101365 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:17.101301 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33fcaecf-093b-4a6b-9bed-0310951d4825-cert\") pod \"ingress-canary-kn8hz\" (UID: \"33fcaecf-093b-4a6b-9bed-0310951d4825\") " pod="openshift-ingress-canary/ingress-canary-kn8hz" Apr 22 19:24:17.101496 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:17.101400 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e544a780-e5f6-409b-b57c-0e80f0766bb1-metrics-tls\") pod \"dns-default-jb2x5\" (UID: \"e544a780-e5f6-409b-b57c-0e80f0766bb1\") " pod="openshift-dns/dns-default-jb2x5" Apr 22 19:24:17.101753 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:17.101729 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-tls\") pod \"image-registry-67c7f85b4c-gsf88\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88" Apr 22 19:24:17.177036 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:17.176972 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88" Apr 22 19:24:17.187855 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:17.187822 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-jb2x5" Apr 22 19:24:17.195669 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:17.195623 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kn8hz" Apr 22 19:24:18.673029 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:18.672998 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jb2x5"] Apr 22 19:24:18.685037 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:18.685009 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kn8hz"] Apr 22 19:24:18.706100 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:18.705976 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-67c7f85b4c-gsf88"] Apr 22 19:24:18.841318 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:24:18.841222 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode544a780_e5f6_409b_b57c_0e80f0766bb1.slice/crio-89e1eef3905b76b15f2dd9c7a462b3514726078c2e04e50c6bbe147214334237 WatchSource:0}: Error finding container 89e1eef3905b76b15f2dd9c7a462b3514726078c2e04e50c6bbe147214334237: Status 404 returned error can't find the container with id 89e1eef3905b76b15f2dd9c7a462b3514726078c2e04e50c6bbe147214334237 Apr 22 19:24:18.842270 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:24:18.842245 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33fcaecf_093b_4a6b_9bed_0310951d4825.slice/crio-8b2668e85dc4d3fd26b86b60a32085c1f2e7932030946cbaefcae06931bb08cb WatchSource:0}: Error finding container 8b2668e85dc4d3fd26b86b60a32085c1f2e7932030946cbaefcae06931bb08cb: Status 404 returned error can't find the container with id 8b2668e85dc4d3fd26b86b60a32085c1f2e7932030946cbaefcae06931bb08cb Apr 22 19:24:18.843097 ip-10-0-129-145 
kubenswrapper[2574]: W0422 19:24:18.843053 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd234ae5_2ef2_482b_9874_6902bc15a04a.slice/crio-c14db7660b8d4d253dad68f888843ba4a3b1b5c93f84b4a017beffd23d067872 WatchSource:0}: Error finding container c14db7660b8d4d253dad68f888843ba4a3b1b5c93f84b4a017beffd23d067872: Status 404 returned error can't find the container with id c14db7660b8d4d253dad68f888843ba4a3b1b5c93f84b4a017beffd23d067872 Apr 22 19:24:19.633355 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:19.633304 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jb2x5" event={"ID":"e544a780-e5f6-409b-b57c-0e80f0766bb1","Type":"ContainerStarted","Data":"89e1eef3905b76b15f2dd9c7a462b3514726078c2e04e50c6bbe147214334237"} Apr 22 19:24:19.635084 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:19.635038 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-k48jx" event={"ID":"793d5ba1-c977-4404-bd38-8b78e8e5e191","Type":"ContainerStarted","Data":"b55d0c5aaefa9e032e72c122233941cae145b18965684abf4657f356cbdb4f51"} Apr 22 19:24:19.636793 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:19.636694 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88" event={"ID":"fd234ae5-2ef2-482b-9874-6902bc15a04a","Type":"ContainerStarted","Data":"3680d2a17f5e74fb626a62bff67d23ddb62930b27aaadb11063ea04e57e2ab9c"} Apr 22 19:24:19.636793 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:19.636728 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88" event={"ID":"fd234ae5-2ef2-482b-9874-6902bc15a04a","Type":"ContainerStarted","Data":"c14db7660b8d4d253dad68f888843ba4a3b1b5c93f84b4a017beffd23d067872"} Apr 22 19:24:19.637191 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:19.637148 2574 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88" Apr 22 19:24:19.638110 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:19.638074 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kn8hz" event={"ID":"33fcaecf-093b-4a6b-9bed-0310951d4825","Type":"ContainerStarted","Data":"8b2668e85dc4d3fd26b86b60a32085c1f2e7932030946cbaefcae06931bb08cb"} Apr 22 19:24:19.651758 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:19.651705 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-k48jx" podStartSLOduration=33.531859095 podStartE2EDuration="37.651687543s" podCreationTimestamp="2026-04-22 19:23:42 +0000 UTC" firstStartedPulling="2026-04-22 19:24:14.760065959 +0000 UTC m=+46.879445249" lastFinishedPulling="2026-04-22 19:24:18.87989441 +0000 UTC m=+50.999273697" observedRunningTime="2026-04-22 19:24:19.651132794 +0000 UTC m=+51.770512106" watchObservedRunningTime="2026-04-22 19:24:19.651687543 +0000 UTC m=+51.771066853" Apr 22 19:24:19.669605 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:19.669549 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88" podStartSLOduration=37.669531141 podStartE2EDuration="37.669531141s" podCreationTimestamp="2026-04-22 19:23:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:24:19.669104607 +0000 UTC m=+51.788483917" watchObservedRunningTime="2026-04-22 19:24:19.669531141 +0000 UTC m=+51.788910451" Apr 22 19:24:21.645345 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:21.645305 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kn8hz" 
event={"ID":"33fcaecf-093b-4a6b-9bed-0310951d4825","Type":"ContainerStarted","Data":"35dc7f8a99fd3ffa39297044dad7784485f3b3f296a9cb39086a9df2e7c4ab2c"} Apr 22 19:24:21.646708 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:21.646676 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jb2x5" event={"ID":"e544a780-e5f6-409b-b57c-0e80f0766bb1","Type":"ContainerStarted","Data":"d4548eceeadf05128191f6d91ade46d1bf91f0be0d98f63d0f70b7bec9cb8c72"} Apr 22 19:24:21.671043 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:21.670962 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-kn8hz" podStartSLOduration=18.103365038 podStartE2EDuration="20.670945349s" podCreationTimestamp="2026-04-22 19:24:01 +0000 UTC" firstStartedPulling="2026-04-22 19:24:18.8676107 +0000 UTC m=+50.986989991" lastFinishedPulling="2026-04-22 19:24:21.435191 +0000 UTC m=+53.554570302" observedRunningTime="2026-04-22 19:24:21.669482278 +0000 UTC m=+53.788861586" watchObservedRunningTime="2026-04-22 19:24:21.670945349 +0000 UTC m=+53.790324658" Apr 22 19:24:22.650820 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:22.650771 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jb2x5" event={"ID":"e544a780-e5f6-409b-b57c-0e80f0766bb1","Type":"ContainerStarted","Data":"069de45095583105f0b4b8a399da3e1a8f3af4c0896f8456c7be36783afd2823"} Apr 22 19:24:22.669436 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:22.669389 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jb2x5" podStartSLOduration=19.106097561 podStartE2EDuration="21.66937548s" podCreationTimestamp="2026-04-22 19:24:01 +0000 UTC" firstStartedPulling="2026-04-22 19:24:18.867516453 +0000 UTC m=+50.986895743" lastFinishedPulling="2026-04-22 19:24:21.430794375 +0000 UTC m=+53.550173662" observedRunningTime="2026-04-22 19:24:22.669015495 +0000 UTC m=+54.788394804" 
watchObservedRunningTime="2026-04-22 19:24:22.66937548 +0000 UTC m=+54.788754789" Apr 22 19:24:23.653128 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:23.653100 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-jb2x5" Apr 22 19:24:25.399348 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.399313 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9t2qh"] Apr 22 19:24:25.404230 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.404213 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9t2qh" Apr 22 19:24:25.407508 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.407485 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 19:24:25.408819 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.408785 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 19:24:25.408908 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.408820 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 19:24:25.408908 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.408866 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-pwghq\"" Apr 22 19:24:25.409020 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.408920 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 19:24:25.416056 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.416036 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9t2qh"] Apr 22 19:24:25.457009 
ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.456981 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/80b67f15-b534-4bb5-98c8-6566228be090-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9t2qh\" (UID: \"80b67f15-b534-4bb5-98c8-6566228be090\") " pod="openshift-insights/insights-runtime-extractor-9t2qh" Apr 22 19:24:25.457153 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.457020 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/80b67f15-b534-4bb5-98c8-6566228be090-data-volume\") pod \"insights-runtime-extractor-9t2qh\" (UID: \"80b67f15-b534-4bb5-98c8-6566228be090\") " pod="openshift-insights/insights-runtime-extractor-9t2qh" Apr 22 19:24:25.457153 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.457056 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/80b67f15-b534-4bb5-98c8-6566228be090-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9t2qh\" (UID: \"80b67f15-b534-4bb5-98c8-6566228be090\") " pod="openshift-insights/insights-runtime-extractor-9t2qh" Apr 22 19:24:25.457153 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.457099 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/80b67f15-b534-4bb5-98c8-6566228be090-crio-socket\") pod \"insights-runtime-extractor-9t2qh\" (UID: \"80b67f15-b534-4bb5-98c8-6566228be090\") " pod="openshift-insights/insights-runtime-extractor-9t2qh" Apr 22 19:24:25.457153 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.457137 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7qp9\" (UniqueName: 
\"kubernetes.io/projected/80b67f15-b534-4bb5-98c8-6566228be090-kube-api-access-b7qp9\") pod \"insights-runtime-extractor-9t2qh\" (UID: \"80b67f15-b534-4bb5-98c8-6566228be090\") " pod="openshift-insights/insights-runtime-extractor-9t2qh" Apr 22 19:24:25.460384 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.460357 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-67c7f85b4c-gsf88"] Apr 22 19:24:25.498768 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.498740 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-chhcp"] Apr 22 19:24:25.501994 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.501975 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-chhcp" Apr 22 19:24:25.504376 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.504352 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-9s6kb\"" Apr 22 19:24:25.504482 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.504402 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 19:24:25.504600 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.504587 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 19:24:25.514474 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.514452 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-chhcp"] Apr 22 19:24:25.558244 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.558218 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/80b67f15-b534-4bb5-98c8-6566228be090-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9t2qh\" 
(UID: \"80b67f15-b534-4bb5-98c8-6566228be090\") " pod="openshift-insights/insights-runtime-extractor-9t2qh" Apr 22 19:24:25.558364 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.558250 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m29df\" (UniqueName: \"kubernetes.io/projected/9656e049-2948-47bd-aec9-0bf4e3612f24-kube-api-access-m29df\") pod \"downloads-6bcc868b7-chhcp\" (UID: \"9656e049-2948-47bd-aec9-0bf4e3612f24\") " pod="openshift-console/downloads-6bcc868b7-chhcp" Apr 22 19:24:25.558364 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.558285 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/80b67f15-b534-4bb5-98c8-6566228be090-data-volume\") pod \"insights-runtime-extractor-9t2qh\" (UID: \"80b67f15-b534-4bb5-98c8-6566228be090\") " pod="openshift-insights/insights-runtime-extractor-9t2qh" Apr 22 19:24:25.558364 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.558318 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/80b67f15-b534-4bb5-98c8-6566228be090-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9t2qh\" (UID: \"80b67f15-b534-4bb5-98c8-6566228be090\") " pod="openshift-insights/insights-runtime-extractor-9t2qh" Apr 22 19:24:25.558500 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.558481 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/80b67f15-b534-4bb5-98c8-6566228be090-crio-socket\") pod \"insights-runtime-extractor-9t2qh\" (UID: \"80b67f15-b534-4bb5-98c8-6566228be090\") " pod="openshift-insights/insights-runtime-extractor-9t2qh" Apr 22 19:24:25.558534 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.558516 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-b7qp9\" (UniqueName: \"kubernetes.io/projected/80b67f15-b534-4bb5-98c8-6566228be090-kube-api-access-b7qp9\") pod \"insights-runtime-extractor-9t2qh\" (UID: \"80b67f15-b534-4bb5-98c8-6566228be090\") " pod="openshift-insights/insights-runtime-extractor-9t2qh" Apr 22 19:24:25.558702 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.558680 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/80b67f15-b534-4bb5-98c8-6566228be090-data-volume\") pod \"insights-runtime-extractor-9t2qh\" (UID: \"80b67f15-b534-4bb5-98c8-6566228be090\") " pod="openshift-insights/insights-runtime-extractor-9t2qh" Apr 22 19:24:25.558864 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.558703 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/80b67f15-b534-4bb5-98c8-6566228be090-crio-socket\") pod \"insights-runtime-extractor-9t2qh\" (UID: \"80b67f15-b534-4bb5-98c8-6566228be090\") " pod="openshift-insights/insights-runtime-extractor-9t2qh" Apr 22 19:24:25.559006 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.558988 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/80b67f15-b534-4bb5-98c8-6566228be090-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9t2qh\" (UID: \"80b67f15-b534-4bb5-98c8-6566228be090\") " pod="openshift-insights/insights-runtime-extractor-9t2qh" Apr 22 19:24:25.560548 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.560529 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/80b67f15-b534-4bb5-98c8-6566228be090-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9t2qh\" (UID: \"80b67f15-b534-4bb5-98c8-6566228be090\") " pod="openshift-insights/insights-runtime-extractor-9t2qh" Apr 22 19:24:25.567233 
ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.567213 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7qp9\" (UniqueName: \"kubernetes.io/projected/80b67f15-b534-4bb5-98c8-6566228be090-kube-api-access-b7qp9\") pod \"insights-runtime-extractor-9t2qh\" (UID: \"80b67f15-b534-4bb5-98c8-6566228be090\") " pod="openshift-insights/insights-runtime-extractor-9t2qh"
Apr 22 19:24:25.596935 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.596910 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-694bf65ddb-9n9qb"]
Apr 22 19:24:25.601368 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.601351 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.611952 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.611931 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-694bf65ddb-9n9qb"]
Apr 22 19:24:25.658959 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.658869 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzr5b\" (UniqueName: \"kubernetes.io/projected/1cc7575c-4a73-4478-a11a-0933bcca8694-kube-api-access-pzr5b\") pod \"image-registry-694bf65ddb-9n9qb\" (UID: \"1cc7575c-4a73-4478-a11a-0933bcca8694\") " pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.658959 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.658904 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1cc7575c-4a73-4478-a11a-0933bcca8694-bound-sa-token\") pod \"image-registry-694bf65ddb-9n9qb\" (UID: \"1cc7575c-4a73-4478-a11a-0933bcca8694\") " pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.658959 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.658931 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1cc7575c-4a73-4478-a11a-0933bcca8694-trusted-ca\") pod \"image-registry-694bf65ddb-9n9qb\" (UID: \"1cc7575c-4a73-4478-a11a-0933bcca8694\") " pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.658959 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.658948 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1cc7575c-4a73-4478-a11a-0933bcca8694-installation-pull-secrets\") pod \"image-registry-694bf65ddb-9n9qb\" (UID: \"1cc7575c-4a73-4478-a11a-0933bcca8694\") " pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.659260 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.658996 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m29df\" (UniqueName: \"kubernetes.io/projected/9656e049-2948-47bd-aec9-0bf4e3612f24-kube-api-access-m29df\") pod \"downloads-6bcc868b7-chhcp\" (UID: \"9656e049-2948-47bd-aec9-0bf4e3612f24\") " pod="openshift-console/downloads-6bcc868b7-chhcp"
Apr 22 19:24:25.659260 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.659021 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1cc7575c-4a73-4478-a11a-0933bcca8694-image-registry-private-configuration\") pod \"image-registry-694bf65ddb-9n9qb\" (UID: \"1cc7575c-4a73-4478-a11a-0933bcca8694\") " pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.659260 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.659041 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1cc7575c-4a73-4478-a11a-0933bcca8694-registry-tls\") pod \"image-registry-694bf65ddb-9n9qb\" (UID: \"1cc7575c-4a73-4478-a11a-0933bcca8694\") " pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.659260 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.659060 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1cc7575c-4a73-4478-a11a-0933bcca8694-ca-trust-extracted\") pod \"image-registry-694bf65ddb-9n9qb\" (UID: \"1cc7575c-4a73-4478-a11a-0933bcca8694\") " pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.659260 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.659085 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1cc7575c-4a73-4478-a11a-0933bcca8694-registry-certificates\") pod \"image-registry-694bf65ddb-9n9qb\" (UID: \"1cc7575c-4a73-4478-a11a-0933bcca8694\") " pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.667986 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.667954 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m29df\" (UniqueName: \"kubernetes.io/projected/9656e049-2948-47bd-aec9-0bf4e3612f24-kube-api-access-m29df\") pod \"downloads-6bcc868b7-chhcp\" (UID: \"9656e049-2948-47bd-aec9-0bf4e3612f24\") " pod="openshift-console/downloads-6bcc868b7-chhcp"
Apr 22 19:24:25.713057 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.713022 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9t2qh"
Apr 22 19:24:25.760105 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.760075 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzr5b\" (UniqueName: \"kubernetes.io/projected/1cc7575c-4a73-4478-a11a-0933bcca8694-kube-api-access-pzr5b\") pod \"image-registry-694bf65ddb-9n9qb\" (UID: \"1cc7575c-4a73-4478-a11a-0933bcca8694\") " pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.760252 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.760114 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1cc7575c-4a73-4478-a11a-0933bcca8694-bound-sa-token\") pod \"image-registry-694bf65ddb-9n9qb\" (UID: \"1cc7575c-4a73-4478-a11a-0933bcca8694\") " pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.760252 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.760145 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1cc7575c-4a73-4478-a11a-0933bcca8694-trusted-ca\") pod \"image-registry-694bf65ddb-9n9qb\" (UID: \"1cc7575c-4a73-4478-a11a-0933bcca8694\") " pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.760252 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.760168 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1cc7575c-4a73-4478-a11a-0933bcca8694-installation-pull-secrets\") pod \"image-registry-694bf65ddb-9n9qb\" (UID: \"1cc7575c-4a73-4478-a11a-0933bcca8694\") " pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.760252 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.760233 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1cc7575c-4a73-4478-a11a-0933bcca8694-image-registry-private-configuration\") pod \"image-registry-694bf65ddb-9n9qb\" (UID: \"1cc7575c-4a73-4478-a11a-0933bcca8694\") " pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.760457 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.760261 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1cc7575c-4a73-4478-a11a-0933bcca8694-registry-tls\") pod \"image-registry-694bf65ddb-9n9qb\" (UID: \"1cc7575c-4a73-4478-a11a-0933bcca8694\") " pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.760457 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.760287 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1cc7575c-4a73-4478-a11a-0933bcca8694-ca-trust-extracted\") pod \"image-registry-694bf65ddb-9n9qb\" (UID: \"1cc7575c-4a73-4478-a11a-0933bcca8694\") " pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.760457 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.760308 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1cc7575c-4a73-4478-a11a-0933bcca8694-registry-certificates\") pod \"image-registry-694bf65ddb-9n9qb\" (UID: \"1cc7575c-4a73-4478-a11a-0933bcca8694\") " pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.761303 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.761252 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1cc7575c-4a73-4478-a11a-0933bcca8694-registry-certificates\") pod \"image-registry-694bf65ddb-9n9qb\" (UID: \"1cc7575c-4a73-4478-a11a-0933bcca8694\") " pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.761497 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.761462 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1cc7575c-4a73-4478-a11a-0933bcca8694-ca-trust-extracted\") pod \"image-registry-694bf65ddb-9n9qb\" (UID: \"1cc7575c-4a73-4478-a11a-0933bcca8694\") " pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.761669 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.761642 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1cc7575c-4a73-4478-a11a-0933bcca8694-trusted-ca\") pod \"image-registry-694bf65ddb-9n9qb\" (UID: \"1cc7575c-4a73-4478-a11a-0933bcca8694\") " pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.763796 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.763760 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1cc7575c-4a73-4478-a11a-0933bcca8694-image-registry-private-configuration\") pod \"image-registry-694bf65ddb-9n9qb\" (UID: \"1cc7575c-4a73-4478-a11a-0933bcca8694\") " pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.763796 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.763770 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1cc7575c-4a73-4478-a11a-0933bcca8694-installation-pull-secrets\") pod \"image-registry-694bf65ddb-9n9qb\" (UID: \"1cc7575c-4a73-4478-a11a-0933bcca8694\") " pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.764500 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.764481 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1cc7575c-4a73-4478-a11a-0933bcca8694-registry-tls\") pod \"image-registry-694bf65ddb-9n9qb\" (UID: \"1cc7575c-4a73-4478-a11a-0933bcca8694\") " pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.770741 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.770715 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1cc7575c-4a73-4478-a11a-0933bcca8694-bound-sa-token\") pod \"image-registry-694bf65ddb-9n9qb\" (UID: \"1cc7575c-4a73-4478-a11a-0933bcca8694\") " pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.771140 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.771097 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzr5b\" (UniqueName: \"kubernetes.io/projected/1cc7575c-4a73-4478-a11a-0933bcca8694-kube-api-access-pzr5b\") pod \"image-registry-694bf65ddb-9n9qb\" (UID: \"1cc7575c-4a73-4478-a11a-0933bcca8694\") " pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.810275 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.810246 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-chhcp"
Apr 22 19:24:25.835935 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.835907 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9t2qh"]
Apr 22 19:24:25.838945 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:24:25.838903 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80b67f15_b534_4bb5_98c8_6566228be090.slice/crio-a56b118d16a4ead5e733909bc608c95d7f7582ff44112615c69c8b71f5ff04d8 WatchSource:0}: Error finding container a56b118d16a4ead5e733909bc608c95d7f7582ff44112615c69c8b71f5ff04d8: Status 404 returned error can't find the container with id a56b118d16a4ead5e733909bc608c95d7f7582ff44112615c69c8b71f5ff04d8
Apr 22 19:24:25.909393 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.909336 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:25.930375 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:25.930344 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-chhcp"]
Apr 22 19:24:26.031567 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:26.031531 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-694bf65ddb-9n9qb"]
Apr 22 19:24:26.034297 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:24:26.034271 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cc7575c_4a73_4478_a11a_0933bcca8694.slice/crio-f98dc6b50f66605bf6a74ab17b411e666128f33a72f75afe160e0edb6639e7b4 WatchSource:0}: Error finding container f98dc6b50f66605bf6a74ab17b411e666128f33a72f75afe160e0edb6639e7b4: Status 404 returned error can't find the container with id f98dc6b50f66605bf6a74ab17b411e666128f33a72f75afe160e0edb6639e7b4
Apr 22 19:24:26.662388 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:26.662356 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9t2qh" event={"ID":"80b67f15-b534-4bb5-98c8-6566228be090","Type":"ContainerStarted","Data":"35bf3bab6835feb170d5cf113f65a9c65fa2ab7adabda82feefce2e393274354"}
Apr 22 19:24:26.662795 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:26.662399 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9t2qh" event={"ID":"80b67f15-b534-4bb5-98c8-6566228be090","Type":"ContainerStarted","Data":"a56b118d16a4ead5e733909bc608c95d7f7582ff44112615c69c8b71f5ff04d8"}
Apr 22 19:24:26.663944 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:26.663920 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb" event={"ID":"1cc7575c-4a73-4478-a11a-0933bcca8694","Type":"ContainerStarted","Data":"b639188bba1d6d46cd8ac7e7a2259b7d178b3be18d65d01776e3dc2751a4a40e"}
Apr 22 19:24:26.664059 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:26.663952 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb" event={"ID":"1cc7575c-4a73-4478-a11a-0933bcca8694","Type":"ContainerStarted","Data":"f98dc6b50f66605bf6a74ab17b411e666128f33a72f75afe160e0edb6639e7b4"}
Apr 22 19:24:26.664059 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:26.664003 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb"
Apr 22 19:24:26.664997 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:26.664976 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-chhcp" event={"ID":"9656e049-2948-47bd-aec9-0bf4e3612f24","Type":"ContainerStarted","Data":"f8cde29e2e89f7520f69bb436303f642444cb4a23b0e775f0e6ce47cc6e80199"}
Apr 22 19:24:26.688815 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:26.688754 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb" podStartSLOduration=1.6887388250000002 podStartE2EDuration="1.688738825s" podCreationTimestamp="2026-04-22 19:24:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:24:26.686748803 +0000 UTC m=+58.806128113" watchObservedRunningTime="2026-04-22 19:24:26.688738825 +0000 UTC m=+58.808118133"
Apr 22 19:24:27.671055 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.671012 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9t2qh" event={"ID":"80b67f15-b534-4bb5-98c8-6566228be090","Type":"ContainerStarted","Data":"cca361c0d042080ddc354452cfaab170cb6723f542b1bff4f2462a9d12473250"}
Apr 22 19:24:27.685060 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.685030 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7bd787488f-p94lz"]
Apr 22 19:24:27.688385 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.688365 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bd787488f-p94lz"
Apr 22 19:24:27.692480 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.692457 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 22 19:24:27.692607 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.692521 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 22 19:24:27.692607 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.692545 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 22 19:24:27.692607 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.692457 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-skskp\""
Apr 22 19:24:27.692607 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.692598 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 22 19:24:27.692935 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.692684 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 22 19:24:27.697581 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.697560 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bd787488f-p94lz"]
Apr 22 19:24:27.775978 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.775938 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9c41abc-f805-4876-a5cb-d7e883e5b450-console-oauth-config\") pod \"console-7bd787488f-p94lz\" (UID: \"a9c41abc-f805-4876-a5cb-d7e883e5b450\") " pod="openshift-console/console-7bd787488f-p94lz"
Apr 22 19:24:27.776240 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.776078 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9c41abc-f805-4876-a5cb-d7e883e5b450-console-config\") pod \"console-7bd787488f-p94lz\" (UID: \"a9c41abc-f805-4876-a5cb-d7e883e5b450\") " pod="openshift-console/console-7bd787488f-p94lz"
Apr 22 19:24:27.776334 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.776313 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9c41abc-f805-4876-a5cb-d7e883e5b450-oauth-serving-cert\") pod \"console-7bd787488f-p94lz\" (UID: \"a9c41abc-f805-4876-a5cb-d7e883e5b450\") " pod="openshift-console/console-7bd787488f-p94lz"
Apr 22 19:24:27.776402 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.776350 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kn2q\" (UniqueName: \"kubernetes.io/projected/a9c41abc-f805-4876-a5cb-d7e883e5b450-kube-api-access-8kn2q\") pod \"console-7bd787488f-p94lz\" (UID: \"a9c41abc-f805-4876-a5cb-d7e883e5b450\") " pod="openshift-console/console-7bd787488f-p94lz"
Apr 22 19:24:27.776468 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.776406 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9c41abc-f805-4876-a5cb-d7e883e5b450-console-serving-cert\") pod \"console-7bd787488f-p94lz\" (UID: \"a9c41abc-f805-4876-a5cb-d7e883e5b450\") " pod="openshift-console/console-7bd787488f-p94lz"
Apr 22 19:24:27.776468 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.776433 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9c41abc-f805-4876-a5cb-d7e883e5b450-service-ca\") pod \"console-7bd787488f-p94lz\" (UID: \"a9c41abc-f805-4876-a5cb-d7e883e5b450\") " pod="openshift-console/console-7bd787488f-p94lz"
Apr 22 19:24:27.877838 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.877777 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9c41abc-f805-4876-a5cb-d7e883e5b450-console-config\") pod \"console-7bd787488f-p94lz\" (UID: \"a9c41abc-f805-4876-a5cb-d7e883e5b450\") " pod="openshift-console/console-7bd787488f-p94lz"
Apr 22 19:24:27.878002 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.877869 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9c41abc-f805-4876-a5cb-d7e883e5b450-oauth-serving-cert\") pod \"console-7bd787488f-p94lz\" (UID: \"a9c41abc-f805-4876-a5cb-d7e883e5b450\") " pod="openshift-console/console-7bd787488f-p94lz"
Apr 22 19:24:27.878002 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.877897 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kn2q\" (UniqueName: \"kubernetes.io/projected/a9c41abc-f805-4876-a5cb-d7e883e5b450-kube-api-access-8kn2q\") pod \"console-7bd787488f-p94lz\" (UID: \"a9c41abc-f805-4876-a5cb-d7e883e5b450\") " pod="openshift-console/console-7bd787488f-p94lz"
Apr 22 19:24:27.878002 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.877947 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9c41abc-f805-4876-a5cb-d7e883e5b450-console-serving-cert\") pod \"console-7bd787488f-p94lz\" (UID: \"a9c41abc-f805-4876-a5cb-d7e883e5b450\") " pod="openshift-console/console-7bd787488f-p94lz"
Apr 22 19:24:27.878782 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.878327 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9c41abc-f805-4876-a5cb-d7e883e5b450-service-ca\") pod \"console-7bd787488f-p94lz\" (UID: \"a9c41abc-f805-4876-a5cb-d7e883e5b450\") " pod="openshift-console/console-7bd787488f-p94lz"
Apr 22 19:24:27.878782 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.878426 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9c41abc-f805-4876-a5cb-d7e883e5b450-console-oauth-config\") pod \"console-7bd787488f-p94lz\" (UID: \"a9c41abc-f805-4876-a5cb-d7e883e5b450\") " pod="openshift-console/console-7bd787488f-p94lz"
Apr 22 19:24:27.878782 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.878655 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9c41abc-f805-4876-a5cb-d7e883e5b450-console-config\") pod \"console-7bd787488f-p94lz\" (UID: \"a9c41abc-f805-4876-a5cb-d7e883e5b450\") " pod="openshift-console/console-7bd787488f-p94lz"
Apr 22 19:24:27.879368 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.879330 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9c41abc-f805-4876-a5cb-d7e883e5b450-service-ca\") pod \"console-7bd787488f-p94lz\" (UID: \"a9c41abc-f805-4876-a5cb-d7e883e5b450\") " pod="openshift-console/console-7bd787488f-p94lz"
Apr 22 19:24:27.879472 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.879337 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9c41abc-f805-4876-a5cb-d7e883e5b450-oauth-serving-cert\") pod \"console-7bd787488f-p94lz\" (UID: \"a9c41abc-f805-4876-a5cb-d7e883e5b450\") " pod="openshift-console/console-7bd787488f-p94lz"
Apr 22 19:24:27.880887 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.880842 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9c41abc-f805-4876-a5cb-d7e883e5b450-console-serving-cert\") pod \"console-7bd787488f-p94lz\" (UID: \"a9c41abc-f805-4876-a5cb-d7e883e5b450\") " pod="openshift-console/console-7bd787488f-p94lz"
Apr 22 19:24:27.881262 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.881242 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9c41abc-f805-4876-a5cb-d7e883e5b450-console-oauth-config\") pod \"console-7bd787488f-p94lz\" (UID: \"a9c41abc-f805-4876-a5cb-d7e883e5b450\") " pod="openshift-console/console-7bd787488f-p94lz"
Apr 22 19:24:27.891110 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:27.891085 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kn2q\" (UniqueName: \"kubernetes.io/projected/a9c41abc-f805-4876-a5cb-d7e883e5b450-kube-api-access-8kn2q\") pod \"console-7bd787488f-p94lz\" (UID: \"a9c41abc-f805-4876-a5cb-d7e883e5b450\") " pod="openshift-console/console-7bd787488f-p94lz"
Apr 22 19:24:28.000211 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:28.000176 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bd787488f-p94lz"
Apr 22 19:24:28.463018 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:28.462993 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bd787488f-p94lz"]
Apr 22 19:24:28.467677 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:24:28.467649 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9c41abc_f805_4876_a5cb_d7e883e5b450.slice/crio-f7c9b6ae6ffa0a09f9f7c2f818c8fd127c6d80ab35cc34fa9873db6cefe9fe87 WatchSource:0}: Error finding container f7c9b6ae6ffa0a09f9f7c2f818c8fd127c6d80ab35cc34fa9873db6cefe9fe87: Status 404 returned error can't find the container with id f7c9b6ae6ffa0a09f9f7c2f818c8fd127c6d80ab35cc34fa9873db6cefe9fe87
Apr 22 19:24:28.675062 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:28.674978 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9t2qh" event={"ID":"80b67f15-b534-4bb5-98c8-6566228be090","Type":"ContainerStarted","Data":"11b7bac878a9de4c15a6b8146cffd50baed194151f6abe203d8e349d20f341a8"}
Apr 22 19:24:28.676042 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:28.676020 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bd787488f-p94lz" event={"ID":"a9c41abc-f805-4876-a5cb-d7e883e5b450","Type":"ContainerStarted","Data":"f7c9b6ae6ffa0a09f9f7c2f818c8fd127c6d80ab35cc34fa9873db6cefe9fe87"}
Apr 22 19:24:28.696205 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:28.696157 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9t2qh" podStartSLOduration=1.221550334 podStartE2EDuration="3.696144161s" podCreationTimestamp="2026-04-22 19:24:25 +0000 UTC" firstStartedPulling="2026-04-22 19:24:25.912182068 +0000 UTC m=+58.031561359" lastFinishedPulling="2026-04-22 19:24:28.386775884 +0000 UTC m=+60.506155186" observedRunningTime="2026-04-22 19:24:28.695357751 +0000 UTC m=+60.814737041" watchObservedRunningTime="2026-04-22 19:24:28.696144161 +0000 UTC m=+60.815523447"
Apr 22 19:24:28.834826 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:28.834781 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rsnsl"
Apr 22 19:24:29.429007 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:29.428961 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kkgq2"]
Apr 22 19:24:29.432396 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:29.432274 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kkgq2"
Apr 22 19:24:29.435183 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:29.435157 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-cpzvl\""
Apr 22 19:24:29.435398 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:29.435378 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 22 19:24:29.443069 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:29.443001 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kkgq2"]
Apr 22 19:24:29.490278 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:29.490234 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/0916d097-426d-4179-a5d0-b4cbdbeb9c21-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-kkgq2\" (UID: \"0916d097-426d-4179-a5d0-b4cbdbeb9c21\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kkgq2"
Apr 22 19:24:29.591302 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:29.591262 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/0916d097-426d-4179-a5d0-b4cbdbeb9c21-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-kkgq2\" (UID: \"0916d097-426d-4179-a5d0-b4cbdbeb9c21\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kkgq2"
Apr 22 19:24:29.591479 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:29.591421 2574 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 22 19:24:29.591557 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:29.591486 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0916d097-426d-4179-a5d0-b4cbdbeb9c21-tls-certificates podName:0916d097-426d-4179-a5d0-b4cbdbeb9c21 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:30.091467022 +0000 UTC m=+62.210846317 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/0916d097-426d-4179-a5d0-b4cbdbeb9c21-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-kkgq2" (UID: "0916d097-426d-4179-a5d0-b4cbdbeb9c21") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 22 19:24:30.095685 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:30.095636 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/0916d097-426d-4179-a5d0-b4cbdbeb9c21-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-kkgq2\" (UID: \"0916d097-426d-4179-a5d0-b4cbdbeb9c21\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kkgq2"
Apr 22 19:24:30.098840 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:30.098778 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/0916d097-426d-4179-a5d0-b4cbdbeb9c21-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-kkgq2\" (UID: \"0916d097-426d-4179-a5d0-b4cbdbeb9c21\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kkgq2"
Apr 22 19:24:30.347651 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:30.347544 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kkgq2"
Apr 22 19:24:30.522984 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:30.522952 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kkgq2"]
Apr 22 19:24:31.602723 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:24:31.602683 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0916d097_426d_4179_a5d0_b4cbdbeb9c21.slice/crio-a1fddc4b49c4784243e60b5856baba1ed17e9682244bab26b4df7f4283f51ffb WatchSource:0}: Error finding container a1fddc4b49c4784243e60b5856baba1ed17e9682244bab26b4df7f4283f51ffb: Status 404 returned error can't find the container with id a1fddc4b49c4784243e60b5856baba1ed17e9682244bab26b4df7f4283f51ffb
Apr 22 19:24:31.685207 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:31.685183 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kkgq2" event={"ID":"0916d097-426d-4179-a5d0-b4cbdbeb9c21","Type":"ContainerStarted","Data":"a1fddc4b49c4784243e60b5856baba1ed17e9682244bab26b4df7f4283f51ffb"}
Apr 22 19:24:32.690521 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:32.690474 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bd787488f-p94lz" event={"ID":"a9c41abc-f805-4876-a5cb-d7e883e5b450","Type":"ContainerStarted","Data":"1ad9521c9804a439963752345d463172583bea8a96269e28c7085ad26a2a91f2"}
Apr 22 19:24:32.708309 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:32.708259 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7bd787488f-p94lz" podStartSLOduration=2.5185313430000003 podStartE2EDuration="5.708245763s" podCreationTimestamp="2026-04-22 19:24:27 +0000 UTC" firstStartedPulling="2026-04-22 19:24:28.469288949 +0000 UTC m=+60.588668239" lastFinishedPulling="2026-04-22 19:24:31.65900336 +0000 UTC m=+63.778382659" observedRunningTime="2026-04-22 19:24:32.707153431 +0000 UTC m=+64.826532774" watchObservedRunningTime="2026-04-22 19:24:32.708245763 +0000 UTC m=+64.827625072"
Apr 22 19:24:33.658069 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:33.658032 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jb2x5"
Apr 22 19:24:33.695237 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:33.695173 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kkgq2" event={"ID":"0916d097-426d-4179-a5d0-b4cbdbeb9c21","Type":"ContainerStarted","Data":"aa401dcaa962ea4ae465852e932d69586b76dfff47f270c4ec575d0331282a6b"}
Apr 22 19:24:33.695702 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:33.695479 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kkgq2"
Apr 22 19:24:33.702565 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:33.702537 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kkgq2"
Apr 22 19:24:33.715096 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:33.715052 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kkgq2" podStartSLOduration=3.39719574 podStartE2EDuration="4.71503889s" podCreationTimestamp="2026-04-22 19:24:29 +0000 UTC" firstStartedPulling="2026-04-22 19:24:31.605184329 +0000 UTC m=+63.724563616" lastFinishedPulling="2026-04-22 19:24:32.923027476 +0000 UTC m=+65.042406766" observedRunningTime="2026-04-22 19:24:33.713962975 +0000 UTC m=+65.833342288" watchObservedRunningTime="2026-04-22 19:24:33.71503889 +0000 UTC m=+65.834418198"
Apr 22 19:24:34.234753 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:34.234709 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c461896-346c-4de1-9362-b9f83bd3486d-metrics-certs\") pod \"network-metrics-daemon-nndbq\" (UID: \"1c461896-346c-4de1-9362-b9f83bd3486d\") " pod="openshift-multus/network-metrics-daemon-nndbq"
Apr 22 19:24:34.237331 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:34.237296 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c461896-346c-4de1-9362-b9f83bd3486d-metrics-certs\") pod \"network-metrics-daemon-nndbq\" (UID: \"1c461896-346c-4de1-9362-b9f83bd3486d\") " pod="openshift-multus/network-metrics-daemon-nndbq"
Apr 22 19:24:34.447887 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:34.447836 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tqt8b\""
Apr 22 19:24:34.456013 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:34.455987 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nndbq"
Apr 22 19:24:34.590429 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:34.590396 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nndbq"]
Apr 22 19:24:34.593304 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:24:34.593268 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c461896_346c_4de1_9362_b9f83bd3486d.slice/crio-b09768ab5709452b30429fb7a2d199ed4a39d5f49e9dafd8a664f8000bad0a48 WatchSource:0}: Error finding container b09768ab5709452b30429fb7a2d199ed4a39d5f49e9dafd8a664f8000bad0a48: Status 404 returned error can't find the container with id b09768ab5709452b30429fb7a2d199ed4a39d5f49e9dafd8a664f8000bad0a48
Apr 22 19:24:34.698946 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:34.698905 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nndbq" event={"ID":"1c461896-346c-4de1-9362-b9f83bd3486d","Type":"ContainerStarted","Data":"b09768ab5709452b30429fb7a2d199ed4a39d5f49e9dafd8a664f8000bad0a48"}
Apr 22 19:24:35.467926 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:35.467569 2574 patch_prober.go:28] interesting pod/image-registry-67c7f85b4c-gsf88 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 19:24:35.467926 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:35.467674 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88" podUID="fd234ae5-2ef2-482b-9874-6902bc15a04a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:24:36.706977 ip-10-0-129-145 kubenswrapper[2574]:
I0422 19:24:36.706941 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nndbq" event={"ID":"1c461896-346c-4de1-9362-b9f83bd3486d","Type":"ContainerStarted","Data":"d789cb6c436cd9bc43c4893d2219966ccfdf8043b4f1cef6f7343db32c52c511"} Apr 22 19:24:37.711394 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:37.711349 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nndbq" event={"ID":"1c461896-346c-4de1-9362-b9f83bd3486d","Type":"ContainerStarted","Data":"b20f6a4991f5ecc78f004e57284628be516dae7ce509a62fbf7c8d2b75f2a3b3"} Apr 22 19:24:37.730175 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:37.730126 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-nndbq" podStartSLOduration=67.941439886 podStartE2EDuration="1m9.730112017s" podCreationTimestamp="2026-04-22 19:23:28 +0000 UTC" firstStartedPulling="2026-04-22 19:24:34.595684363 +0000 UTC m=+66.715063653" lastFinishedPulling="2026-04-22 19:24:36.384356497 +0000 UTC m=+68.503735784" observedRunningTime="2026-04-22 19:24:37.728360427 +0000 UTC m=+69.847739735" watchObservedRunningTime="2026-04-22 19:24:37.730112017 +0000 UTC m=+69.849491325" Apr 22 19:24:38.000917 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:38.000876 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7bd787488f-p94lz" Apr 22 19:24:38.000917 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:38.000933 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7bd787488f-p94lz" Apr 22 19:24:38.006373 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:38.006348 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7bd787488f-p94lz" Apr 22 19:24:38.608969 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:38.608942 2574 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bllx4" Apr 22 19:24:38.718747 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:38.718718 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7bd787488f-p94lz" Apr 22 19:24:38.897471 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:38.897388 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-q6jjs"] Apr 22 19:24:38.939328 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:38.939291 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:38.944019 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:38.943989 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 19:24:38.944231 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:38.944216 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 19:24:38.944490 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:38.944471 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 19:24:38.944673 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:38.944655 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 19:24:38.944730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:38.944679 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-kkgwm\"" Apr 22 19:24:38.945233 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:38.944892 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 19:24:38.945233 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:38.945100 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 19:24:38.975354 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:38.975316 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-node-exporter-tls\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:38.975548 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:38.975383 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-metrics-client-ca\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:38.975548 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:38.975472 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-node-exporter-wtmp\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:38.975548 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:38.975522 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-sys\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:38.975705 ip-10-0-129-145 
kubenswrapper[2574]: I0422 19:24:38.975557 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-node-exporter-textfile\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:38.975705 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:38.975583 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncqhh\" (UniqueName: \"kubernetes.io/projected/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-kube-api-access-ncqhh\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:38.975705 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:38.975671 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-root\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:38.975884 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:38.975727 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:38.975884 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:38.975770 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-node-exporter-accelerators-collector-config\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:39.076937 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:39.076845 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-root\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:39.076937 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:39.076903 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:39.076937 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:39.076930 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-node-exporter-accelerators-collector-config\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:39.077263 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:39.076966 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-node-exporter-tls\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:39.077263 ip-10-0-129-145 kubenswrapper[2574]: I0422 
19:24:39.077001 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-metrics-client-ca\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:39.077263 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:39.077039 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-node-exporter-wtmp\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:39.077263 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:39.077074 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-sys\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:39.077263 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:39.077103 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-node-exporter-textfile\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:39.077263 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:39.077126 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ncqhh\" (UniqueName: \"kubernetes.io/projected/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-kube-api-access-ncqhh\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:39.077563 ip-10-0-129-145 
kubenswrapper[2574]: I0422 19:24:39.077526 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-root\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:39.080394 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:39.078451 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-metrics-client-ca\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:39.080394 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:39.078884 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-node-exporter-accelerators-collector-config\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:39.080394 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:39.078974 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 19:24:39.080394 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:39.080000 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-sys\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:39.080394 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:39.080123 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-node-exporter-wtmp\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:39.080394 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:39.080354 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-node-exporter-textfile\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:39.080868 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:39.080850 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-node-exporter-tls podName:5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d nodeName:}" failed. No retries permitted until 2026-04-22 19:24:39.580826695 +0000 UTC m=+71.700205997 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-node-exporter-tls") pod "node-exporter-q6jjs" (UID: "5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d") : secret "node-exporter-tls" not found Apr 22 19:24:39.084264 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:39.084240 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:39.098216 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:39.098170 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncqhh\" (UniqueName: \"kubernetes.io/projected/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-kube-api-access-ncqhh\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:39.581136 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:39.581102 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-node-exporter-tls\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:39.583748 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:39.583718 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d-node-exporter-tls\") pod \"node-exporter-q6jjs\" (UID: \"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d\") " pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:39.854742 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:39.854661 2574 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-q6jjs" Apr 22 19:24:45.081116 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:24:45.081085 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c86ce86_5a72_4a8d_8e4f_42bc351d2b4d.slice/crio-b13848dc99be14efa6829586b2f60f1bcb422143ec4995e72e76a88d5ff8db71 WatchSource:0}: Error finding container b13848dc99be14efa6829586b2f60f1bcb422143ec4995e72e76a88d5ff8db71: Status 404 returned error can't find the container with id b13848dc99be14efa6829586b2f60f1bcb422143ec4995e72e76a88d5ff8db71 Apr 22 19:24:45.467276 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:45.467192 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88" Apr 22 19:24:45.737224 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:45.737120 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-q6jjs" event={"ID":"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d","Type":"ContainerStarted","Data":"b13848dc99be14efa6829586b2f60f1bcb422143ec4995e72e76a88d5ff8db71"} Apr 22 19:24:45.739766 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:45.739688 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-chhcp" event={"ID":"9656e049-2948-47bd-aec9-0bf4e3612f24","Type":"ContainerStarted","Data":"89d20cf5a43923670c3d649e4e0c4fae87085eb1d28c7808b44221ab5f791c98"} Apr 22 19:24:45.740558 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:45.740536 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-chhcp" Apr 22 19:24:45.750046 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:45.750001 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-chhcp" Apr 22 
19:24:45.757898 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:45.757835 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-chhcp" podStartSLOduration=1.525223754 podStartE2EDuration="20.757815615s" podCreationTimestamp="2026-04-22 19:24:25 +0000 UTC" firstStartedPulling="2026-04-22 19:24:25.935364539 +0000 UTC m=+58.054743839" lastFinishedPulling="2026-04-22 19:24:45.167956407 +0000 UTC m=+77.287335700" observedRunningTime="2026-04-22 19:24:45.756494066 +0000 UTC m=+77.875873377" watchObservedRunningTime="2026-04-22 19:24:45.757815615 +0000 UTC m=+77.877194925" Apr 22 19:24:46.744631 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:46.744589 2574 generic.go:358] "Generic (PLEG): container finished" podID="5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d" containerID="14194138f22d1410bc9bce61a989cfeef1f689a54db57429166a31e6a09c99a7" exitCode=0 Apr 22 19:24:46.745112 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:46.744687 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-q6jjs" event={"ID":"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d","Type":"ContainerDied","Data":"14194138f22d1410bc9bce61a989cfeef1f689a54db57429166a31e6a09c99a7"} Apr 22 19:24:47.675866 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:47.675830 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-694bf65ddb-9n9qb" Apr 22 19:24:47.751011 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:47.750954 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-q6jjs" event={"ID":"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d","Type":"ContainerStarted","Data":"c70d4733b3c7b2198935f9876c0689755706a4efb08744c088c2a650e756cdc8"} Apr 22 19:24:47.751011 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:47.751015 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-q6jjs" 
event={"ID":"5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d","Type":"ContainerStarted","Data":"7ffc1320e52d16e3d65f8da7e592cdae2c19a2350aff2faf752bbde8167f8f28"} Apr 22 19:24:47.774497 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:47.774443 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-q6jjs" podStartSLOduration=9.004861618 podStartE2EDuration="9.774429887s" podCreationTimestamp="2026-04-22 19:24:38 +0000 UTC" firstStartedPulling="2026-04-22 19:24:45.083623283 +0000 UTC m=+77.203002577" lastFinishedPulling="2026-04-22 19:24:45.853191543 +0000 UTC m=+77.972570846" observedRunningTime="2026-04-22 19:24:47.773711791 +0000 UTC m=+79.893091122" watchObservedRunningTime="2026-04-22 19:24:47.774429887 +0000 UTC m=+79.893809206" Apr 22 19:24:50.478921 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.478849 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88" podUID="fd234ae5-2ef2-482b-9874-6902bc15a04a" containerName="registry" containerID="cri-o://3680d2a17f5e74fb626a62bff67d23ddb62930b27aaadb11063ea04e57e2ab9c" gracePeriod=30 Apr 22 19:24:50.753921 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.753888 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88" Apr 22 19:24:50.761458 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.761423 2574 generic.go:358] "Generic (PLEG): container finished" podID="fd234ae5-2ef2-482b-9874-6902bc15a04a" containerID="3680d2a17f5e74fb626a62bff67d23ddb62930b27aaadb11063ea04e57e2ab9c" exitCode=0 Apr 22 19:24:50.761582 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.761484 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88" event={"ID":"fd234ae5-2ef2-482b-9874-6902bc15a04a","Type":"ContainerDied","Data":"3680d2a17f5e74fb626a62bff67d23ddb62930b27aaadb11063ea04e57e2ab9c"} Apr 22 19:24:50.761582 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.761500 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88" Apr 22 19:24:50.761582 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.761515 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67c7f85b4c-gsf88" event={"ID":"fd234ae5-2ef2-482b-9874-6902bc15a04a","Type":"ContainerDied","Data":"c14db7660b8d4d253dad68f888843ba4a3b1b5c93f84b4a017beffd23d067872"} Apr 22 19:24:50.761582 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.761530 2574 scope.go:117] "RemoveContainer" containerID="3680d2a17f5e74fb626a62bff67d23ddb62930b27aaadb11063ea04e57e2ab9c" Apr 22 19:24:50.770395 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.770365 2574 scope.go:117] "RemoveContainer" containerID="3680d2a17f5e74fb626a62bff67d23ddb62930b27aaadb11063ea04e57e2ab9c" Apr 22 19:24:50.770729 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:24:50.770705 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3680d2a17f5e74fb626a62bff67d23ddb62930b27aaadb11063ea04e57e2ab9c\": container with ID starting with 
3680d2a17f5e74fb626a62bff67d23ddb62930b27aaadb11063ea04e57e2ab9c not found: ID does not exist" containerID="3680d2a17f5e74fb626a62bff67d23ddb62930b27aaadb11063ea04e57e2ab9c" Apr 22 19:24:50.770914 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.770741 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3680d2a17f5e74fb626a62bff67d23ddb62930b27aaadb11063ea04e57e2ab9c"} err="failed to get container status \"3680d2a17f5e74fb626a62bff67d23ddb62930b27aaadb11063ea04e57e2ab9c\": rpc error: code = NotFound desc = could not find container \"3680d2a17f5e74fb626a62bff67d23ddb62930b27aaadb11063ea04e57e2ab9c\": container with ID starting with 3680d2a17f5e74fb626a62bff67d23ddb62930b27aaadb11063ea04e57e2ab9c not found: ID does not exist" Apr 22 19:24:50.884106 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.884070 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd234ae5-2ef2-482b-9874-6902bc15a04a-trusted-ca\") pod \"fd234ae5-2ef2-482b-9874-6902bc15a04a\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " Apr 22 19:24:50.884295 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.884143 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p9wm\" (UniqueName: \"kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-kube-api-access-9p9wm\") pod \"fd234ae5-2ef2-482b-9874-6902bc15a04a\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " Apr 22 19:24:50.884295 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.884173 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fd234ae5-2ef2-482b-9874-6902bc15a04a-installation-pull-secrets\") pod \"fd234ae5-2ef2-482b-9874-6902bc15a04a\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") " Apr 22 19:24:50.884295 ip-10-0-129-145 kubenswrapper[2574]: I0422 
19:24:50.884209 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-certificates\") pod \"fd234ae5-2ef2-482b-9874-6902bc15a04a\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") "
Apr 22 19:24:50.884295 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.884248 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fd234ae5-2ef2-482b-9874-6902bc15a04a-image-registry-private-configuration\") pod \"fd234ae5-2ef2-482b-9874-6902bc15a04a\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") "
Apr 22 19:24:50.884295 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.884277 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fd234ae5-2ef2-482b-9874-6902bc15a04a-ca-trust-extracted\") pod \"fd234ae5-2ef2-482b-9874-6902bc15a04a\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") "
Apr 22 19:24:50.884501 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.884312 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-tls\") pod \"fd234ae5-2ef2-482b-9874-6902bc15a04a\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") "
Apr 22 19:24:50.884501 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.884335 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-bound-sa-token\") pod \"fd234ae5-2ef2-482b-9874-6902bc15a04a\" (UID: \"fd234ae5-2ef2-482b-9874-6902bc15a04a\") "
Apr 22 19:24:50.884501 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.884490 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd234ae5-2ef2-482b-9874-6902bc15a04a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "fd234ae5-2ef2-482b-9874-6902bc15a04a" (UID: "fd234ae5-2ef2-482b-9874-6902bc15a04a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:24:50.884650 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.884573 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd234ae5-2ef2-482b-9874-6902bc15a04a-trusted-ca\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\""
Apr 22 19:24:50.884650 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.884627 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "fd234ae5-2ef2-482b-9874-6902bc15a04a" (UID: "fd234ae5-2ef2-482b-9874-6902bc15a04a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:24:50.887139 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.887111 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd234ae5-2ef2-482b-9874-6902bc15a04a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "fd234ae5-2ef2-482b-9874-6902bc15a04a" (UID: "fd234ae5-2ef2-482b-9874-6902bc15a04a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:24:50.887266 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.887140 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "fd234ae5-2ef2-482b-9874-6902bc15a04a" (UID: "fd234ae5-2ef2-482b-9874-6902bc15a04a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:24:50.887266 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.887117 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-kube-api-access-9p9wm" (OuterVolumeSpecName: "kube-api-access-9p9wm") pod "fd234ae5-2ef2-482b-9874-6902bc15a04a" (UID: "fd234ae5-2ef2-482b-9874-6902bc15a04a"). InnerVolumeSpecName "kube-api-access-9p9wm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:24:50.887382 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.887293 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "fd234ae5-2ef2-482b-9874-6902bc15a04a" (UID: "fd234ae5-2ef2-482b-9874-6902bc15a04a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:24:50.887382 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.887326 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd234ae5-2ef2-482b-9874-6902bc15a04a-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "fd234ae5-2ef2-482b-9874-6902bc15a04a" (UID: "fd234ae5-2ef2-482b-9874-6902bc15a04a"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:24:50.896049 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.896023 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd234ae5-2ef2-482b-9874-6902bc15a04a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "fd234ae5-2ef2-482b-9874-6902bc15a04a" (UID: "fd234ae5-2ef2-482b-9874-6902bc15a04a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:24:50.985707 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.985665 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9p9wm\" (UniqueName: \"kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-kube-api-access-9p9wm\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\""
Apr 22 19:24:50.985707 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.985707 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fd234ae5-2ef2-482b-9874-6902bc15a04a-installation-pull-secrets\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\""
Apr 22 19:24:50.985932 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.985723 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-certificates\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\""
Apr 22 19:24:50.985932 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.985740 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fd234ae5-2ef2-482b-9874-6902bc15a04a-image-registry-private-configuration\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\""
Apr 22 19:24:50.985932 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.985756 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fd234ae5-2ef2-482b-9874-6902bc15a04a-ca-trust-extracted\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\""
Apr 22 19:24:50.985932 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.985773 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-registry-tls\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\""
Apr 22 19:24:50.985932 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:50.985789 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd234ae5-2ef2-482b-9874-6902bc15a04a-bound-sa-token\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\""
Apr 22 19:24:51.085359 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:51.085324 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-67c7f85b4c-gsf88"]
Apr 22 19:24:51.089091 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:51.089066 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-67c7f85b4c-gsf88"]
Apr 22 19:24:52.429049 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:52.429014 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd234ae5-2ef2-482b-9874-6902bc15a04a" path="/var/lib/kubelet/pods/fd234ae5-2ef2-482b-9874-6902bc15a04a/volumes"
Apr 22 19:24:59.789310 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:24:59.789275 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7bd787488f-p94lz"]
Apr 22 19:25:24.808420 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:24.808361 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7bd787488f-p94lz" podUID="a9c41abc-f805-4876-a5cb-d7e883e5b450" containerName="console" containerID="cri-o://1ad9521c9804a439963752345d463172583bea8a96269e28c7085ad26a2a91f2" gracePeriod=15
Apr 22 19:25:25.048851 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.048829 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7bd787488f-p94lz_a9c41abc-f805-4876-a5cb-d7e883e5b450/console/0.log"
Apr 22 19:25:25.048982 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.048886 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bd787488f-p94lz"
Apr 22 19:25:25.129223 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.129131 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kn2q\" (UniqueName: \"kubernetes.io/projected/a9c41abc-f805-4876-a5cb-d7e883e5b450-kube-api-access-8kn2q\") pod \"a9c41abc-f805-4876-a5cb-d7e883e5b450\" (UID: \"a9c41abc-f805-4876-a5cb-d7e883e5b450\") "
Apr 22 19:25:25.129223 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.129179 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9c41abc-f805-4876-a5cb-d7e883e5b450-console-oauth-config\") pod \"a9c41abc-f805-4876-a5cb-d7e883e5b450\" (UID: \"a9c41abc-f805-4876-a5cb-d7e883e5b450\") "
Apr 22 19:25:25.129223 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.129217 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9c41abc-f805-4876-a5cb-d7e883e5b450-console-serving-cert\") pod \"a9c41abc-f805-4876-a5cb-d7e883e5b450\" (UID: \"a9c41abc-f805-4876-a5cb-d7e883e5b450\") "
Apr 22 19:25:25.129489 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.129238 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9c41abc-f805-4876-a5cb-d7e883e5b450-oauth-serving-cert\") pod \"a9c41abc-f805-4876-a5cb-d7e883e5b450\" (UID: \"a9c41abc-f805-4876-a5cb-d7e883e5b450\") "
Apr 22 19:25:25.129489 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.129257 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9c41abc-f805-4876-a5cb-d7e883e5b450-service-ca\") pod \"a9c41abc-f805-4876-a5cb-d7e883e5b450\" (UID: \"a9c41abc-f805-4876-a5cb-d7e883e5b450\") "
Apr 22 19:25:25.129489 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.129280 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9c41abc-f805-4876-a5cb-d7e883e5b450-console-config\") pod \"a9c41abc-f805-4876-a5cb-d7e883e5b450\" (UID: \"a9c41abc-f805-4876-a5cb-d7e883e5b450\") "
Apr 22 19:25:25.129708 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.129678 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9c41abc-f805-4876-a5cb-d7e883e5b450-service-ca" (OuterVolumeSpecName: "service-ca") pod "a9c41abc-f805-4876-a5cb-d7e883e5b450" (UID: "a9c41abc-f805-4876-a5cb-d7e883e5b450"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:25:25.129779 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.129706 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9c41abc-f805-4876-a5cb-d7e883e5b450-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a9c41abc-f805-4876-a5cb-d7e883e5b450" (UID: "a9c41abc-f805-4876-a5cb-d7e883e5b450"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:25:25.129779 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.129744 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9c41abc-f805-4876-a5cb-d7e883e5b450-console-config" (OuterVolumeSpecName: "console-config") pod "a9c41abc-f805-4876-a5cb-d7e883e5b450" (UID: "a9c41abc-f805-4876-a5cb-d7e883e5b450"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:25:25.131570 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.131538 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9c41abc-f805-4876-a5cb-d7e883e5b450-kube-api-access-8kn2q" (OuterVolumeSpecName: "kube-api-access-8kn2q") pod "a9c41abc-f805-4876-a5cb-d7e883e5b450" (UID: "a9c41abc-f805-4876-a5cb-d7e883e5b450"). InnerVolumeSpecName "kube-api-access-8kn2q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:25:25.131570 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.131551 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9c41abc-f805-4876-a5cb-d7e883e5b450-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a9c41abc-f805-4876-a5cb-d7e883e5b450" (UID: "a9c41abc-f805-4876-a5cb-d7e883e5b450"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:25:25.131721 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.131632 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9c41abc-f805-4876-a5cb-d7e883e5b450-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a9c41abc-f805-4876-a5cb-d7e883e5b450" (UID: "a9c41abc-f805-4876-a5cb-d7e883e5b450"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:25:25.230155 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.230113 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9c41abc-f805-4876-a5cb-d7e883e5b450-service-ca\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\""
Apr 22 19:25:25.230155 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.230147 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9c41abc-f805-4876-a5cb-d7e883e5b450-console-config\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\""
Apr 22 19:25:25.230359 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.230159 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8kn2q\" (UniqueName: \"kubernetes.io/projected/a9c41abc-f805-4876-a5cb-d7e883e5b450-kube-api-access-8kn2q\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\""
Apr 22 19:25:25.230359 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.230181 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9c41abc-f805-4876-a5cb-d7e883e5b450-console-oauth-config\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\""
Apr 22 19:25:25.230359 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.230193 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9c41abc-f805-4876-a5cb-d7e883e5b450-console-serving-cert\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\""
Apr 22 19:25:25.230359 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.230204 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9c41abc-f805-4876-a5cb-d7e883e5b450-oauth-serving-cert\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\""
Apr 22 19:25:25.864670 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.864639 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7bd787488f-p94lz_a9c41abc-f805-4876-a5cb-d7e883e5b450/console/0.log"
Apr 22 19:25:25.865153 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.864680 2574 generic.go:358] "Generic (PLEG): container finished" podID="a9c41abc-f805-4876-a5cb-d7e883e5b450" containerID="1ad9521c9804a439963752345d463172583bea8a96269e28c7085ad26a2a91f2" exitCode=2
Apr 22 19:25:25.865153 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.864740 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bd787488f-p94lz" event={"ID":"a9c41abc-f805-4876-a5cb-d7e883e5b450","Type":"ContainerDied","Data":"1ad9521c9804a439963752345d463172583bea8a96269e28c7085ad26a2a91f2"}
Apr 22 19:25:25.865153 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.864761 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bd787488f-p94lz"
Apr 22 19:25:25.865153 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.864781 2574 scope.go:117] "RemoveContainer" containerID="1ad9521c9804a439963752345d463172583bea8a96269e28c7085ad26a2a91f2"
Apr 22 19:25:25.865153 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.864770 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bd787488f-p94lz" event={"ID":"a9c41abc-f805-4876-a5cb-d7e883e5b450","Type":"ContainerDied","Data":"f7c9b6ae6ffa0a09f9f7c2f818c8fd127c6d80ab35cc34fa9873db6cefe9fe87"}
Apr 22 19:25:25.872661 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.872394 2574 scope.go:117] "RemoveContainer" containerID="1ad9521c9804a439963752345d463172583bea8a96269e28c7085ad26a2a91f2"
Apr 22 19:25:25.872728 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:25:25.872681 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ad9521c9804a439963752345d463172583bea8a96269e28c7085ad26a2a91f2\": container with ID starting with 1ad9521c9804a439963752345d463172583bea8a96269e28c7085ad26a2a91f2 not found: ID does not exist" containerID="1ad9521c9804a439963752345d463172583bea8a96269e28c7085ad26a2a91f2"
Apr 22 19:25:25.872764 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.872717 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ad9521c9804a439963752345d463172583bea8a96269e28c7085ad26a2a91f2"} err="failed to get container status \"1ad9521c9804a439963752345d463172583bea8a96269e28c7085ad26a2a91f2\": rpc error: code = NotFound desc = could not find container \"1ad9521c9804a439963752345d463172583bea8a96269e28c7085ad26a2a91f2\": container with ID starting with 1ad9521c9804a439963752345d463172583bea8a96269e28c7085ad26a2a91f2 not found: ID does not exist"
Apr 22 19:25:25.885053 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.885022 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7bd787488f-p94lz"]
Apr 22 19:25:25.887295 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:25.887272 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7bd787488f-p94lz"]
Apr 22 19:25:26.428897 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:25:26.428864 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9c41abc-f805-4876-a5cb-d7e883e5b450" path="/var/lib/kubelet/pods/a9c41abc-f805-4876-a5cb-d7e883e5b450/volumes"
Apr 22 19:26:02.203961 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.203927 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7c688944ff-648w2"]
Apr 22 19:26:02.204423 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.204157 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd234ae5-2ef2-482b-9874-6902bc15a04a" containerName="registry"
Apr 22 19:26:02.204423 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.204168 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd234ae5-2ef2-482b-9874-6902bc15a04a" containerName="registry"
Apr 22 19:26:02.204423 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.204180 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9c41abc-f805-4876-a5cb-d7e883e5b450" containerName="console"
Apr 22 19:26:02.204423 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.204185 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9c41abc-f805-4876-a5cb-d7e883e5b450" containerName="console"
Apr 22 19:26:02.204423 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.204227 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="a9c41abc-f805-4876-a5cb-d7e883e5b450" containerName="console"
Apr 22 19:26:02.204423 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.204234 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd234ae5-2ef2-482b-9874-6902bc15a04a" containerName="registry"
Apr 22 19:26:02.207069 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.207051 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:02.211491 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.211467 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 22 19:26:02.211603 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.211520 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-skskp\""
Apr 22 19:26:02.211603 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.211560 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 22 19:26:02.212917 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.212894 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 22 19:26:02.213034 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.212940 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 22 19:26:02.213034 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.212965 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 22 19:26:02.219927 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.219902 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 22 19:26:02.223772 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.223750 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c688944ff-648w2"]
Apr 22 19:26:02.283356 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.283322 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbgf6\" (UniqueName: \"kubernetes.io/projected/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-kube-api-access-hbgf6\") pod \"console-7c688944ff-648w2\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:02.283494 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.283361 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-oauth-serving-cert\") pod \"console-7c688944ff-648w2\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:02.283494 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.283386 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-console-oauth-config\") pod \"console-7c688944ff-648w2\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:02.283494 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.283405 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-service-ca\") pod \"console-7c688944ff-648w2\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:02.283494 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.283465 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-console-config\") pod \"console-7c688944ff-648w2\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:02.283628 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.283518 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-trusted-ca-bundle\") pod \"console-7c688944ff-648w2\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:02.283628 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.283569 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-console-serving-cert\") pod \"console-7c688944ff-648w2\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:02.384712 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.384683 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbgf6\" (UniqueName: \"kubernetes.io/projected/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-kube-api-access-hbgf6\") pod \"console-7c688944ff-648w2\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:02.384832 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.384719 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-oauth-serving-cert\") pod \"console-7c688944ff-648w2\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:02.384832 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.384742 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-console-oauth-config\") pod \"console-7c688944ff-648w2\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:02.384938 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.384918 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-service-ca\") pod \"console-7c688944ff-648w2\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:02.384999 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.384962 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-console-config\") pod \"console-7c688944ff-648w2\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:02.385050 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.385031 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-trusted-ca-bundle\") pod \"console-7c688944ff-648w2\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:02.385100 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.385069 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-console-serving-cert\") pod \"console-7c688944ff-648w2\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:02.385550 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.385527 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-oauth-serving-cert\") pod \"console-7c688944ff-648w2\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:02.385654 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.385607 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-console-config\") pod \"console-7c688944ff-648w2\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:02.385654 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.385606 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-service-ca\") pod \"console-7c688944ff-648w2\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:02.385845 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.385824 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-trusted-ca-bundle\") pod \"console-7c688944ff-648w2\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:02.387243 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.387221 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-console-oauth-config\") pod \"console-7c688944ff-648w2\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:02.387388 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.387372 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-console-serving-cert\") pod \"console-7c688944ff-648w2\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:02.393355 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.393335 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbgf6\" (UniqueName: \"kubernetes.io/projected/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-kube-api-access-hbgf6\") pod \"console-7c688944ff-648w2\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:02.516436 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.516400 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:02.635993 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:26:02.635960 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e4b171f_8770_4cf9_8d75_4d14c8c2a99f.slice/crio-39acea018db163e574ab85a43b2e1e894d90db2221fec3da722c55b0a0c5bffd WatchSource:0}: Error finding container 39acea018db163e574ab85a43b2e1e894d90db2221fec3da722c55b0a0c5bffd: Status 404 returned error can't find the container with id 39acea018db163e574ab85a43b2e1e894d90db2221fec3da722c55b0a0c5bffd
Apr 22 19:26:02.637313 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.637285 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c688944ff-648w2"]
Apr 22 19:26:02.969433 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.969353 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c688944ff-648w2" event={"ID":"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f","Type":"ContainerStarted","Data":"0c0db87b88d328be99872b773d9a2c028ffadedc6996484a02bfde38fbe6e0be"}
Apr 22 19:26:02.969433 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.969386 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c688944ff-648w2" event={"ID":"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f","Type":"ContainerStarted","Data":"39acea018db163e574ab85a43b2e1e894d90db2221fec3da722c55b0a0c5bffd"}
Apr 22 19:26:02.988342 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:02.988296 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7c688944ff-648w2" podStartSLOduration=0.988283933 podStartE2EDuration="988.283933ms" podCreationTimestamp="2026-04-22 19:26:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:26:02.987454366 +0000 UTC m=+155.106833667" watchObservedRunningTime="2026-04-22 19:26:02.988283933 +0000 UTC m=+155.107663242"
Apr 22 19:26:12.516549 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:12.516502 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:12.516549 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:12.516552 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:12.521260 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:12.521241 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:26:12.997742 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:26:12.997716 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7c688944ff-648w2"
Apr 22 19:27:10.074763 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:27:10.074725 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d84c955d9-xxhxb"]
Apr 22 19:27:10.077681 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:27:10.077659 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d84c955d9-xxhxb"
Apr 22 19:27:10.081436 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:27:10.081408 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 22 19:27:10.081436 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:27:10.081430 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 22 19:27:10.081621 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:27:10.081412 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-s4trd\""
Apr 22 19:27:10.081621 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:27:10.081409 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 22 19:27:10.081621 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:27:10.081409 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 22 19:27:10.086987 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:27:10.086959 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d84c955d9-xxhxb"]
Apr 22 19:27:10.175438 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:27:10.175405 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdgrs\" (UniqueName: \"kubernetes.io/projected/1276e78c-a947-44cd-ac5e-5d174ab0b57e-kube-api-access-qdgrs\") pod \"managed-serviceaccount-addon-agent-d84c955d9-xxhxb\" (UID: \"1276e78c-a947-44cd-ac5e-5d174ab0b57e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d84c955d9-xxhxb"
Apr 22 19:27:10.175585 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:27:10.175500 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1276e78c-a947-44cd-ac5e-5d174ab0b57e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-d84c955d9-xxhxb\" (UID: \"1276e78c-a947-44cd-ac5e-5d174ab0b57e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d84c955d9-xxhxb"
Apr 22 19:27:10.276844 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:27:10.276795 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdgrs\" (UniqueName: \"kubernetes.io/projected/1276e78c-a947-44cd-ac5e-5d174ab0b57e-kube-api-access-qdgrs\") pod \"managed-serviceaccount-addon-agent-d84c955d9-xxhxb\" (UID: \"1276e78c-a947-44cd-ac5e-5d174ab0b57e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d84c955d9-xxhxb"
Apr 22 19:27:10.276992 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:27:10.276875 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1276e78c-a947-44cd-ac5e-5d174ab0b57e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-d84c955d9-xxhxb\" (UID: \"1276e78c-a947-44cd-ac5e-5d174ab0b57e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d84c955d9-xxhxb"
Apr 22 19:27:10.279320 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:27:10.279281 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1276e78c-a947-44cd-ac5e-5d174ab0b57e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-d84c955d9-xxhxb\" (UID: \"1276e78c-a947-44cd-ac5e-5d174ab0b57e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d84c955d9-xxhxb"
Apr 22 19:27:10.285795 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:27:10.285774 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdgrs\" (UniqueName: \"kubernetes.io/projected/1276e78c-a947-44cd-ac5e-5d174ab0b57e-kube-api-access-qdgrs\") pod \"managed-serviceaccount-addon-agent-d84c955d9-xxhxb\" (UID: \"1276e78c-a947-44cd-ac5e-5d174ab0b57e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d84c955d9-xxhxb"
Apr 22 19:27:10.400349 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:27:10.400251 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d84c955d9-xxhxb"
Apr 22 19:27:10.517147 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:27:10.517116 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d84c955d9-xxhxb"]
Apr 22 19:27:10.520435 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:27:10.520410 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1276e78c_a947_44cd_ac5e_5d174ab0b57e.slice/crio-106e333ebb698c2275e7caac65bb79b033062ddba35d04c3e0d64e951e406242 WatchSource:0}: Error finding container 106e333ebb698c2275e7caac65bb79b033062ddba35d04c3e0d64e951e406242: Status 404 returned error can't find the container with id 106e333ebb698c2275e7caac65bb79b033062ddba35d04c3e0d64e951e406242
Apr 22 19:27:11.146174 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:27:11.146138 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d84c955d9-xxhxb"
event={"ID":"1276e78c-a947-44cd-ac5e-5d174ab0b57e","Type":"ContainerStarted","Data":"106e333ebb698c2275e7caac65bb79b033062ddba35d04c3e0d64e951e406242"} Apr 22 19:27:13.153349 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:27:13.153315 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d84c955d9-xxhxb" event={"ID":"1276e78c-a947-44cd-ac5e-5d174ab0b57e","Type":"ContainerStarted","Data":"a533077b65de6462f3e387b5190e237d691a7dc5f2e4652661cfc795530ca32d"} Apr 22 19:27:13.170396 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:27:13.170345 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d84c955d9-xxhxb" podStartSLOduration=0.71166199 podStartE2EDuration="3.170333034s" podCreationTimestamp="2026-04-22 19:27:10 +0000 UTC" firstStartedPulling="2026-04-22 19:27:10.522450952 +0000 UTC m=+222.641830239" lastFinishedPulling="2026-04-22 19:27:12.981121983 +0000 UTC m=+225.100501283" observedRunningTime="2026-04-22 19:27:13.168566976 +0000 UTC m=+225.287946306" watchObservedRunningTime="2026-04-22 19:27:13.170333034 +0000 UTC m=+225.289712378" Apr 22 19:28:05.647152 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:05.647114 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4"] Apr 22 19:28:05.652430 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:05.652411 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4" Apr 22 19:28:05.657018 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:05.656997 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 19:28:05.657140 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:05.657057 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 19:28:05.657140 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:05.657003 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-pjwps\"" Apr 22 19:28:05.661601 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:05.661580 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4"] Apr 22 19:28:05.771054 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:05.771027 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b655572-9ec7-43be-9348-505cf5a24218-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4\" (UID: \"1b655572-9ec7-43be-9348-505cf5a24218\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4" Apr 22 19:28:05.771207 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:05.771062 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b655572-9ec7-43be-9348-505cf5a24218-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4\" (UID: \"1b655572-9ec7-43be-9348-505cf5a24218\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4" Apr 22 19:28:05.771207 
ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:05.771090 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xbs2\" (UniqueName: \"kubernetes.io/projected/1b655572-9ec7-43be-9348-505cf5a24218-kube-api-access-6xbs2\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4\" (UID: \"1b655572-9ec7-43be-9348-505cf5a24218\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4" Apr 22 19:28:05.871696 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:05.871658 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b655572-9ec7-43be-9348-505cf5a24218-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4\" (UID: \"1b655572-9ec7-43be-9348-505cf5a24218\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4" Apr 22 19:28:05.871696 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:05.871700 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b655572-9ec7-43be-9348-505cf5a24218-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4\" (UID: \"1b655572-9ec7-43be-9348-505cf5a24218\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4" Apr 22 19:28:05.871950 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:05.871721 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6xbs2\" (UniqueName: \"kubernetes.io/projected/1b655572-9ec7-43be-9348-505cf5a24218-kube-api-access-6xbs2\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4\" (UID: \"1b655572-9ec7-43be-9348-505cf5a24218\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4" Apr 22 19:28:05.872113 ip-10-0-129-145 
kubenswrapper[2574]: I0422 19:28:05.872090 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b655572-9ec7-43be-9348-505cf5a24218-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4\" (UID: \"1b655572-9ec7-43be-9348-505cf5a24218\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4" Apr 22 19:28:05.872113 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:05.872105 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b655572-9ec7-43be-9348-505cf5a24218-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4\" (UID: \"1b655572-9ec7-43be-9348-505cf5a24218\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4" Apr 22 19:28:05.880378 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:05.880358 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xbs2\" (UniqueName: \"kubernetes.io/projected/1b655572-9ec7-43be-9348-505cf5a24218-kube-api-access-6xbs2\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4\" (UID: \"1b655572-9ec7-43be-9348-505cf5a24218\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4" Apr 22 19:28:05.961335 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:05.961264 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4" Apr 22 19:28:06.079510 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:06.079370 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4"] Apr 22 19:28:06.082743 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:28:06.082708 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b655572_9ec7_43be_9348_505cf5a24218.slice/crio-0e0acf5039dd57fe1e75d1443b34bf19184da31e3c4824d3f53055707925da25 WatchSource:0}: Error finding container 0e0acf5039dd57fe1e75d1443b34bf19184da31e3c4824d3f53055707925da25: Status 404 returned error can't find the container with id 0e0acf5039dd57fe1e75d1443b34bf19184da31e3c4824d3f53055707925da25 Apr 22 19:28:06.288123 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:06.288090 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4" event={"ID":"1b655572-9ec7-43be-9348-505cf5a24218","Type":"ContainerStarted","Data":"0e0acf5039dd57fe1e75d1443b34bf19184da31e3c4824d3f53055707925da25"} Apr 22 19:28:12.307853 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:12.307795 2574 generic.go:358] "Generic (PLEG): container finished" podID="1b655572-9ec7-43be-9348-505cf5a24218" containerID="e98c2ae9bc9ae2f202fb62d0395c7aa129c37553986b57562c40d88672c3f32f" exitCode=0 Apr 22 19:28:12.308241 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:12.307892 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4" event={"ID":"1b655572-9ec7-43be-9348-505cf5a24218","Type":"ContainerDied","Data":"e98c2ae9bc9ae2f202fb62d0395c7aa129c37553986b57562c40d88672c3f32f"} Apr 22 19:28:15.316503 ip-10-0-129-145 kubenswrapper[2574]: 
I0422 19:28:15.316466 2574 generic.go:358] "Generic (PLEG): container finished" podID="1b655572-9ec7-43be-9348-505cf5a24218" containerID="c23c08e12acd73bfe06c7f70308f8fd4b9ca54fa9737b0b11b20a3f88ec1922f" exitCode=0 Apr 22 19:28:15.316883 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:15.316530 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4" event={"ID":"1b655572-9ec7-43be-9348-505cf5a24218","Type":"ContainerDied","Data":"c23c08e12acd73bfe06c7f70308f8fd4b9ca54fa9737b0b11b20a3f88ec1922f"} Apr 22 19:28:21.335067 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:21.335030 2574 generic.go:358] "Generic (PLEG): container finished" podID="1b655572-9ec7-43be-9348-505cf5a24218" containerID="7fe1cd63b28c0971fdd155e2816019f7a3334930f0b11b0a39a2189885b886e6" exitCode=0 Apr 22 19:28:21.335443 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:21.335113 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4" event={"ID":"1b655572-9ec7-43be-9348-505cf5a24218","Type":"ContainerDied","Data":"7fe1cd63b28c0971fdd155e2816019f7a3334930f0b11b0a39a2189885b886e6"} Apr 22 19:28:22.454638 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:22.454614 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4" Apr 22 19:28:22.602792 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:22.602708 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b655572-9ec7-43be-9348-505cf5a24218-bundle\") pod \"1b655572-9ec7-43be-9348-505cf5a24218\" (UID: \"1b655572-9ec7-43be-9348-505cf5a24218\") " Apr 22 19:28:22.602964 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:22.602821 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xbs2\" (UniqueName: \"kubernetes.io/projected/1b655572-9ec7-43be-9348-505cf5a24218-kube-api-access-6xbs2\") pod \"1b655572-9ec7-43be-9348-505cf5a24218\" (UID: \"1b655572-9ec7-43be-9348-505cf5a24218\") " Apr 22 19:28:22.602964 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:22.602866 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b655572-9ec7-43be-9348-505cf5a24218-util\") pod \"1b655572-9ec7-43be-9348-505cf5a24218\" (UID: \"1b655572-9ec7-43be-9348-505cf5a24218\") " Apr 22 19:28:22.603346 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:22.603303 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b655572-9ec7-43be-9348-505cf5a24218-bundle" (OuterVolumeSpecName: "bundle") pod "1b655572-9ec7-43be-9348-505cf5a24218" (UID: "1b655572-9ec7-43be-9348-505cf5a24218"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:28:22.605031 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:22.605003 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b655572-9ec7-43be-9348-505cf5a24218-kube-api-access-6xbs2" (OuterVolumeSpecName: "kube-api-access-6xbs2") pod "1b655572-9ec7-43be-9348-505cf5a24218" (UID: "1b655572-9ec7-43be-9348-505cf5a24218"). InnerVolumeSpecName "kube-api-access-6xbs2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:28:22.607296 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:22.607262 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b655572-9ec7-43be-9348-505cf5a24218-util" (OuterVolumeSpecName: "util") pod "1b655572-9ec7-43be-9348-505cf5a24218" (UID: "1b655572-9ec7-43be-9348-505cf5a24218"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:28:22.703411 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:22.703373 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b655572-9ec7-43be-9348-505cf5a24218-util\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\"" Apr 22 19:28:22.703563 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:22.703425 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b655572-9ec7-43be-9348-505cf5a24218-bundle\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\"" Apr 22 19:28:22.703563 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:22.703435 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6xbs2\" (UniqueName: \"kubernetes.io/projected/1b655572-9ec7-43be-9348-505cf5a24218-kube-api-access-6xbs2\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\"" Apr 22 19:28:23.343924 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:23.343888 2574 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4" event={"ID":"1b655572-9ec7-43be-9348-505cf5a24218","Type":"ContainerDied","Data":"0e0acf5039dd57fe1e75d1443b34bf19184da31e3c4824d3f53055707925da25"} Apr 22 19:28:23.343924 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:23.343922 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e0acf5039dd57fe1e75d1443b34bf19184da31e3c4824d3f53055707925da25" Apr 22 19:28:23.344164 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:23.343971 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8z8x4" Apr 22 19:28:27.586436 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:27.586402 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rpnc6"] Apr 22 19:28:27.586829 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:27.586654 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b655572-9ec7-43be-9348-505cf5a24218" containerName="pull" Apr 22 19:28:27.586829 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:27.586664 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b655572-9ec7-43be-9348-505cf5a24218" containerName="pull" Apr 22 19:28:27.586829 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:27.586671 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b655572-9ec7-43be-9348-505cf5a24218" containerName="extract" Apr 22 19:28:27.586829 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:27.586677 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b655572-9ec7-43be-9348-505cf5a24218" containerName="extract" Apr 22 19:28:27.586829 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:27.586690 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="1b655572-9ec7-43be-9348-505cf5a24218" containerName="util" Apr 22 19:28:27.586829 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:27.586696 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b655572-9ec7-43be-9348-505cf5a24218" containerName="util" Apr 22 19:28:27.586829 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:27.586732 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b655572-9ec7-43be-9348-505cf5a24218" containerName="extract" Apr 22 19:28:27.598891 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:27.598866 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rpnc6" Apr 22 19:28:27.601685 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:27.601658 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 22 19:28:27.601868 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:27.601658 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 22 19:28:27.601974 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:27.601954 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-xfq9l\"" Apr 22 19:28:27.602185 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:27.602154 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rpnc6"] Apr 22 19:28:27.602275 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:27.602146 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 22 19:28:27.739835 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:27.739777 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/secret/0c8761e6-a87c-450b-aab8-7c064ce5a1fa-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-rpnc6\" (UID: \"0c8761e6-a87c-450b-aab8-7c064ce5a1fa\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rpnc6" Apr 22 19:28:27.739995 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:27.739853 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvv7t\" (UniqueName: \"kubernetes.io/projected/0c8761e6-a87c-450b-aab8-7c064ce5a1fa-kube-api-access-lvv7t\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-rpnc6\" (UID: \"0c8761e6-a87c-450b-aab8-7c064ce5a1fa\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rpnc6" Apr 22 19:28:27.840980 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:27.840900 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0c8761e6-a87c-450b-aab8-7c064ce5a1fa-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-rpnc6\" (UID: \"0c8761e6-a87c-450b-aab8-7c064ce5a1fa\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rpnc6" Apr 22 19:28:27.840980 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:27.840947 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lvv7t\" (UniqueName: \"kubernetes.io/projected/0c8761e6-a87c-450b-aab8-7c064ce5a1fa-kube-api-access-lvv7t\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-rpnc6\" (UID: \"0c8761e6-a87c-450b-aab8-7c064ce5a1fa\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rpnc6" Apr 22 19:28:27.843272 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:27.843246 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0c8761e6-a87c-450b-aab8-7c064ce5a1fa-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-rpnc6\" (UID: 
\"0c8761e6-a87c-450b-aab8-7c064ce5a1fa\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rpnc6" Apr 22 19:28:27.851754 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:27.851729 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvv7t\" (UniqueName: \"kubernetes.io/projected/0c8761e6-a87c-450b-aab8-7c064ce5a1fa-kube-api-access-lvv7t\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-rpnc6\" (UID: \"0c8761e6-a87c-450b-aab8-7c064ce5a1fa\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rpnc6" Apr 22 19:28:27.910242 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:27.910217 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rpnc6" Apr 22 19:28:28.032006 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:28.031971 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rpnc6"] Apr 22 19:28:28.034707 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:28:28.034675 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c8761e6_a87c_450b_aab8_7c064ce5a1fa.slice/crio-2cd3c628471e3804ad27a01e2d5f2690f7273c592c97c1fe7cf45fdf9dfabf02 WatchSource:0}: Error finding container 2cd3c628471e3804ad27a01e2d5f2690f7273c592c97c1fe7cf45fdf9dfabf02: Status 404 returned error can't find the container with id 2cd3c628471e3804ad27a01e2d5f2690f7273c592c97c1fe7cf45fdf9dfabf02 Apr 22 19:28:28.344202 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:28.344172 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-acl-logging/0.log" Apr 22 19:28:28.344525 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:28.344508 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-acl-logging/0.log" Apr 22 19:28:28.347729 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:28.347707 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 19:28:28.359100 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:28.359072 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rpnc6" event={"ID":"0c8761e6-a87c-450b-aab8-7c064ce5a1fa","Type":"ContainerStarted","Data":"2cd3c628471e3804ad27a01e2d5f2690f7273c592c97c1fe7cf45fdf9dfabf02"} Apr 22 19:28:32.377179 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:32.377081 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rpnc6" event={"ID":"0c8761e6-a87c-450b-aab8-7c064ce5a1fa","Type":"ContainerStarted","Data":"b127d5bc498703d9307ad81adbe841725c127e3485ae4413a1f677ab6462c1b1"} Apr 22 19:28:32.377639 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:32.377241 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rpnc6" Apr 22 19:28:32.398412 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:32.398365 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rpnc6" podStartSLOduration=1.352008096 podStartE2EDuration="5.398351295s" podCreationTimestamp="2026-04-22 19:28:27 +0000 UTC" firstStartedPulling="2026-04-22 19:28:28.036435658 +0000 UTC m=+300.155814948" lastFinishedPulling="2026-04-22 19:28:32.082778858 +0000 UTC m=+304.202158147" observedRunningTime="2026-04-22 19:28:32.397353815 +0000 UTC m=+304.516733124" watchObservedRunningTime="2026-04-22 19:28:32.398351295 +0000 UTC m=+304.517730604" Apr 22 19:28:32.590340 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:32.590298 2574 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-ftth9"] Apr 22 19:28:32.618553 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:32.618524 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-ftth9"] Apr 22 19:28:32.618718 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:32.618651 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-ftth9" Apr 22 19:28:32.621405 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:32.621382 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-hjn5n\"" Apr 22 19:28:32.621538 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:32.621407 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 22 19:28:32.621538 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:32.621417 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 22 19:28:32.779973 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:32.779939 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/ec651285-d96e-43b8-8319-f3486a54e45b-cabundle0\") pod \"keda-operator-ffbb595cb-ftth9\" (UID: \"ec651285-d96e-43b8-8319-f3486a54e45b\") " pod="openshift-keda/keda-operator-ffbb595cb-ftth9" Apr 22 19:28:32.779973 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:32.779973 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtsvd\" (UniqueName: \"kubernetes.io/projected/ec651285-d96e-43b8-8319-f3486a54e45b-kube-api-access-gtsvd\") pod \"keda-operator-ffbb595cb-ftth9\" (UID: \"ec651285-d96e-43b8-8319-f3486a54e45b\") " pod="openshift-keda/keda-operator-ffbb595cb-ftth9" Apr 22 19:28:32.780186 
ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:32.780007 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ec651285-d96e-43b8-8319-f3486a54e45b-certificates\") pod \"keda-operator-ffbb595cb-ftth9\" (UID: \"ec651285-d96e-43b8-8319-f3486a54e45b\") " pod="openshift-keda/keda-operator-ffbb595cb-ftth9" Apr 22 19:28:32.794086 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:32.794055 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc"] Apr 22 19:28:32.818720 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:32.818685 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc"] Apr 22 19:28:32.818876 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:32.818825 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc" Apr 22 19:28:32.821472 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:32.821452 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 22 19:28:32.880976 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:32.880943 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ec651285-d96e-43b8-8319-f3486a54e45b-certificates\") pod \"keda-operator-ffbb595cb-ftth9\" (UID: \"ec651285-d96e-43b8-8319-f3486a54e45b\") " pod="openshift-keda/keda-operator-ffbb595cb-ftth9" Apr 22 19:28:32.881156 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:32.881014 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/ec651285-d96e-43b8-8319-f3486a54e45b-cabundle0\") pod \"keda-operator-ffbb595cb-ftth9\" (UID: \"ec651285-d96e-43b8-8319-f3486a54e45b\") " 
pod="openshift-keda/keda-operator-ffbb595cb-ftth9" Apr 22 19:28:32.881156 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:32.881034 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtsvd\" (UniqueName: \"kubernetes.io/projected/ec651285-d96e-43b8-8319-f3486a54e45b-kube-api-access-gtsvd\") pod \"keda-operator-ffbb595cb-ftth9\" (UID: \"ec651285-d96e-43b8-8319-f3486a54e45b\") " pod="openshift-keda/keda-operator-ffbb595cb-ftth9" Apr 22 19:28:32.881156 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:32.881133 2574 secret.go:281] references non-existent secret key: ca.crt Apr 22 19:28:32.881156 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:32.881154 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 19:28:32.881312 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:32.881167 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-ftth9: references non-existent secret key: ca.crt Apr 22 19:28:32.881312 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:32.881238 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ec651285-d96e-43b8-8319-f3486a54e45b-certificates podName:ec651285-d96e-43b8-8319-f3486a54e45b nodeName:}" failed. No retries permitted until 2026-04-22 19:28:33.381219406 +0000 UTC m=+305.500598696 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ec651285-d96e-43b8-8319-f3486a54e45b-certificates") pod "keda-operator-ffbb595cb-ftth9" (UID: "ec651285-d96e-43b8-8319-f3486a54e45b") : references non-existent secret key: ca.crt Apr 22 19:28:32.881719 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:32.881698 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/ec651285-d96e-43b8-8319-f3486a54e45b-cabundle0\") pod \"keda-operator-ffbb595cb-ftth9\" (UID: \"ec651285-d96e-43b8-8319-f3486a54e45b\") " pod="openshift-keda/keda-operator-ffbb595cb-ftth9" Apr 22 19:28:32.890205 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:32.890177 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtsvd\" (UniqueName: \"kubernetes.io/projected/ec651285-d96e-43b8-8319-f3486a54e45b-kube-api-access-gtsvd\") pod \"keda-operator-ffbb595cb-ftth9\" (UID: \"ec651285-d96e-43b8-8319-f3486a54e45b\") " pod="openshift-keda/keda-operator-ffbb595cb-ftth9" Apr 22 19:28:32.981815 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:32.981762 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg5rb\" (UniqueName: \"kubernetes.io/projected/62a54d18-9e56-4a37-945c-6a6e2bc2c991-kube-api-access-jg5rb\") pod \"keda-metrics-apiserver-7c9f485588-n6dbc\" (UID: \"62a54d18-9e56-4a37-945c-6a6e2bc2c991\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc" Apr 22 19:28:32.981815 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:32.981815 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/62a54d18-9e56-4a37-945c-6a6e2bc2c991-certificates\") pod \"keda-metrics-apiserver-7c9f485588-n6dbc\" (UID: \"62a54d18-9e56-4a37-945c-6a6e2bc2c991\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc" Apr 22 19:28:32.982014 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:32.981884 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/62a54d18-9e56-4a37-945c-6a6e2bc2c991-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-n6dbc\" (UID: \"62a54d18-9e56-4a37-945c-6a6e2bc2c991\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc" Apr 22 19:28:33.069832 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:33.069733 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-hml78"] Apr 22 19:28:33.082710 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:33.082676 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jg5rb\" (UniqueName: \"kubernetes.io/projected/62a54d18-9e56-4a37-945c-6a6e2bc2c991-kube-api-access-jg5rb\") pod \"keda-metrics-apiserver-7c9f485588-n6dbc\" (UID: \"62a54d18-9e56-4a37-945c-6a6e2bc2c991\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc" Apr 22 19:28:33.082710 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:33.082712 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/62a54d18-9e56-4a37-945c-6a6e2bc2c991-certificates\") pod \"keda-metrics-apiserver-7c9f485588-n6dbc\" (UID: \"62a54d18-9e56-4a37-945c-6a6e2bc2c991\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc" Apr 22 19:28:33.082956 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:33.082752 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/62a54d18-9e56-4a37-945c-6a6e2bc2c991-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-n6dbc\" (UID: \"62a54d18-9e56-4a37-945c-6a6e2bc2c991\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc" Apr 22 
19:28:33.082956 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:33.082878 2574 secret.go:281] references non-existent secret key: tls.crt Apr 22 19:28:33.082956 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:33.082894 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 19:28:33.082956 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:33.082911 2574 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 22 19:28:33.082956 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:33.082932 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 22 19:28:33.083192 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:33.082991 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/62a54d18-9e56-4a37-945c-6a6e2bc2c991-certificates podName:62a54d18-9e56-4a37-945c-6a6e2bc2c991 nodeName:}" failed. No retries permitted until 2026-04-22 19:28:33.582971454 +0000 UTC m=+305.702350767 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/62a54d18-9e56-4a37-945c-6a6e2bc2c991-certificates") pod "keda-metrics-apiserver-7c9f485588-n6dbc" (UID: "62a54d18-9e56-4a37-945c-6a6e2bc2c991") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 22 19:28:33.083192 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:33.083161 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/62a54d18-9e56-4a37-945c-6a6e2bc2c991-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-n6dbc\" (UID: \"62a54d18-9e56-4a37-945c-6a6e2bc2c991\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc" Apr 22 19:28:33.093123 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:33.093091 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-hml78"] Apr 22 19:28:33.093258 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:33.093209 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-hml78" Apr 22 19:28:33.095973 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:33.095939 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 22 19:28:33.099626 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:33.099599 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg5rb\" (UniqueName: \"kubernetes.io/projected/62a54d18-9e56-4a37-945c-6a6e2bc2c991-kube-api-access-jg5rb\") pod \"keda-metrics-apiserver-7c9f485588-n6dbc\" (UID: \"62a54d18-9e56-4a37-945c-6a6e2bc2c991\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc" Apr 22 19:28:33.183516 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:33.183474 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c5d5af68-0427-4d12-80d1-f5f99c64489d-certificates\") pod \"keda-admission-cf49989db-hml78\" (UID: \"c5d5af68-0427-4d12-80d1-f5f99c64489d\") " pod="openshift-keda/keda-admission-cf49989db-hml78" Apr 22 19:28:33.183702 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:33.183588 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnr87\" (UniqueName: \"kubernetes.io/projected/c5d5af68-0427-4d12-80d1-f5f99c64489d-kube-api-access-bnr87\") pod \"keda-admission-cf49989db-hml78\" (UID: \"c5d5af68-0427-4d12-80d1-f5f99c64489d\") " pod="openshift-keda/keda-admission-cf49989db-hml78" Apr 22 19:28:33.284201 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:33.284161 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bnr87\" (UniqueName: \"kubernetes.io/projected/c5d5af68-0427-4d12-80d1-f5f99c64489d-kube-api-access-bnr87\") pod \"keda-admission-cf49989db-hml78\" (UID: \"c5d5af68-0427-4d12-80d1-f5f99c64489d\") " 
pod="openshift-keda/keda-admission-cf49989db-hml78" Apr 22 19:28:33.284370 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:33.284210 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c5d5af68-0427-4d12-80d1-f5f99c64489d-certificates\") pod \"keda-admission-cf49989db-hml78\" (UID: \"c5d5af68-0427-4d12-80d1-f5f99c64489d\") " pod="openshift-keda/keda-admission-cf49989db-hml78" Apr 22 19:28:33.286694 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:33.286674 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c5d5af68-0427-4d12-80d1-f5f99c64489d-certificates\") pod \"keda-admission-cf49989db-hml78\" (UID: \"c5d5af68-0427-4d12-80d1-f5f99c64489d\") " pod="openshift-keda/keda-admission-cf49989db-hml78" Apr 22 19:28:33.292096 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:33.292076 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnr87\" (UniqueName: \"kubernetes.io/projected/c5d5af68-0427-4d12-80d1-f5f99c64489d-kube-api-access-bnr87\") pod \"keda-admission-cf49989db-hml78\" (UID: \"c5d5af68-0427-4d12-80d1-f5f99c64489d\") " pod="openshift-keda/keda-admission-cf49989db-hml78" Apr 22 19:28:33.384836 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:33.384731 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ec651285-d96e-43b8-8319-f3486a54e45b-certificates\") pod \"keda-operator-ffbb595cb-ftth9\" (UID: \"ec651285-d96e-43b8-8319-f3486a54e45b\") " pod="openshift-keda/keda-operator-ffbb595cb-ftth9" Apr 22 19:28:33.385199 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:33.384885 2574 secret.go:281] references non-existent secret key: ca.crt Apr 22 19:28:33.385199 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:33.384904 2574 projected.go:277] Couldn't get secret payload 
openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 19:28:33.385199 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:33.384913 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-ftth9: references non-existent secret key: ca.crt Apr 22 19:28:33.385199 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:33.384961 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ec651285-d96e-43b8-8319-f3486a54e45b-certificates podName:ec651285-d96e-43b8-8319-f3486a54e45b nodeName:}" failed. No retries permitted until 2026-04-22 19:28:34.384948468 +0000 UTC m=+306.504327754 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ec651285-d96e-43b8-8319-f3486a54e45b-certificates") pod "keda-operator-ffbb595cb-ftth9" (UID: "ec651285-d96e-43b8-8319-f3486a54e45b") : references non-existent secret key: ca.crt Apr 22 19:28:33.412934 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:33.412897 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-hml78" Apr 22 19:28:33.533303 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:33.533241 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-hml78"] Apr 22 19:28:33.536005 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:28:33.535976 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5d5af68_0427_4d12_80d1_f5f99c64489d.slice/crio-e102a95842dca643340338bf9232f4e2b7f5062cddaf9c38e1bfaa4e3e939479 WatchSource:0}: Error finding container e102a95842dca643340338bf9232f4e2b7f5062cddaf9c38e1bfaa4e3e939479: Status 404 returned error can't find the container with id e102a95842dca643340338bf9232f4e2b7f5062cddaf9c38e1bfaa4e3e939479 Apr 22 19:28:33.537252 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:33.537235 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:28:33.585986 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:33.585957 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/62a54d18-9e56-4a37-945c-6a6e2bc2c991-certificates\") pod \"keda-metrics-apiserver-7c9f485588-n6dbc\" (UID: \"62a54d18-9e56-4a37-945c-6a6e2bc2c991\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc" Apr 22 19:28:33.586129 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:33.586064 2574 secret.go:281] references non-existent secret key: tls.crt Apr 22 19:28:33.586129 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:33.586075 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 19:28:33.586129 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:33.586091 2574 projected.go:194] Error preparing data for projected volume certificates for pod 
openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc: references non-existent secret key: tls.crt Apr 22 19:28:33.586228 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:33.586157 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/62a54d18-9e56-4a37-945c-6a6e2bc2c991-certificates podName:62a54d18-9e56-4a37-945c-6a6e2bc2c991 nodeName:}" failed. No retries permitted until 2026-04-22 19:28:34.586144045 +0000 UTC m=+306.705523332 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/62a54d18-9e56-4a37-945c-6a6e2bc2c991-certificates") pod "keda-metrics-apiserver-7c9f485588-n6dbc" (UID: "62a54d18-9e56-4a37-945c-6a6e2bc2c991") : references non-existent secret key: tls.crt Apr 22 19:28:34.384521 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:34.384491 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-hml78" event={"ID":"c5d5af68-0427-4d12-80d1-f5f99c64489d","Type":"ContainerStarted","Data":"e102a95842dca643340338bf9232f4e2b7f5062cddaf9c38e1bfaa4e3e939479"} Apr 22 19:28:34.391904 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:34.391862 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ec651285-d96e-43b8-8319-f3486a54e45b-certificates\") pod \"keda-operator-ffbb595cb-ftth9\" (UID: \"ec651285-d96e-43b8-8319-f3486a54e45b\") " pod="openshift-keda/keda-operator-ffbb595cb-ftth9" Apr 22 19:28:34.392284 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:34.392016 2574 secret.go:281] references non-existent secret key: ca.crt Apr 22 19:28:34.392284 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:34.392037 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 19:28:34.392284 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:34.392050 2574 projected.go:194] Error 
preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-ftth9: references non-existent secret key: ca.crt Apr 22 19:28:34.392284 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:34.392107 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ec651285-d96e-43b8-8319-f3486a54e45b-certificates podName:ec651285-d96e-43b8-8319-f3486a54e45b nodeName:}" failed. No retries permitted until 2026-04-22 19:28:36.392088692 +0000 UTC m=+308.511467996 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ec651285-d96e-43b8-8319-f3486a54e45b-certificates") pod "keda-operator-ffbb595cb-ftth9" (UID: "ec651285-d96e-43b8-8319-f3486a54e45b") : references non-existent secret key: ca.crt Apr 22 19:28:34.593915 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:34.593877 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/62a54d18-9e56-4a37-945c-6a6e2bc2c991-certificates\") pod \"keda-metrics-apiserver-7c9f485588-n6dbc\" (UID: \"62a54d18-9e56-4a37-945c-6a6e2bc2c991\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc" Apr 22 19:28:34.594087 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:34.594015 2574 secret.go:281] references non-existent secret key: tls.crt Apr 22 19:28:34.594087 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:34.594028 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 19:28:34.594087 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:34.594045 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc: references non-existent secret key: tls.crt Apr 22 19:28:34.594191 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:34.594091 2574 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/62a54d18-9e56-4a37-945c-6a6e2bc2c991-certificates podName:62a54d18-9e56-4a37-945c-6a6e2bc2c991 nodeName:}" failed. No retries permitted until 2026-04-22 19:28:36.59407753 +0000 UTC m=+308.713456818 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/62a54d18-9e56-4a37-945c-6a6e2bc2c991-certificates") pod "keda-metrics-apiserver-7c9f485588-n6dbc" (UID: "62a54d18-9e56-4a37-945c-6a6e2bc2c991") : references non-existent secret key: tls.crt Apr 22 19:28:35.388225 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:35.388145 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-hml78" event={"ID":"c5d5af68-0427-4d12-80d1-f5f99c64489d","Type":"ContainerStarted","Data":"d4fcec491441bea1fd517c2732da98bae337e3015aa5ef42b78658d6aa60ba87"} Apr 22 19:28:35.388382 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:35.388276 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-hml78" Apr 22 19:28:35.404929 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:35.404878 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-hml78" podStartSLOduration=0.890757058 podStartE2EDuration="2.404861106s" podCreationTimestamp="2026-04-22 19:28:33 +0000 UTC" firstStartedPulling="2026-04-22 19:28:33.537360432 +0000 UTC m=+305.656739719" lastFinishedPulling="2026-04-22 19:28:35.051464479 +0000 UTC m=+307.170843767" observedRunningTime="2026-04-22 19:28:35.403942576 +0000 UTC m=+307.523321888" watchObservedRunningTime="2026-04-22 19:28:35.404861106 +0000 UTC m=+307.524240417" Apr 22 19:28:36.413121 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:36.413090 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/ec651285-d96e-43b8-8319-f3486a54e45b-certificates\") pod \"keda-operator-ffbb595cb-ftth9\" (UID: \"ec651285-d96e-43b8-8319-f3486a54e45b\") " pod="openshift-keda/keda-operator-ffbb595cb-ftth9" Apr 22 19:28:36.413496 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:36.413236 2574 secret.go:281] references non-existent secret key: ca.crt Apr 22 19:28:36.413496 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:36.413251 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 19:28:36.413496 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:36.413260 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-ftth9: references non-existent secret key: ca.crt Apr 22 19:28:36.413496 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:36.413312 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ec651285-d96e-43b8-8319-f3486a54e45b-certificates podName:ec651285-d96e-43b8-8319-f3486a54e45b nodeName:}" failed. No retries permitted until 2026-04-22 19:28:40.413295275 +0000 UTC m=+312.532674563 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ec651285-d96e-43b8-8319-f3486a54e45b-certificates") pod "keda-operator-ffbb595cb-ftth9" (UID: "ec651285-d96e-43b8-8319-f3486a54e45b") : references non-existent secret key: ca.crt Apr 22 19:28:36.615124 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:36.615090 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/62a54d18-9e56-4a37-945c-6a6e2bc2c991-certificates\") pod \"keda-metrics-apiserver-7c9f485588-n6dbc\" (UID: \"62a54d18-9e56-4a37-945c-6a6e2bc2c991\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc" Apr 22 19:28:36.615293 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:36.615230 2574 secret.go:281] references non-existent secret key: tls.crt Apr 22 19:28:36.615293 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:36.615252 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 19:28:36.615293 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:36.615271 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc: references non-existent secret key: tls.crt Apr 22 19:28:36.615411 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:28:36.615331 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/62a54d18-9e56-4a37-945c-6a6e2bc2c991-certificates podName:62a54d18-9e56-4a37-945c-6a6e2bc2c991 nodeName:}" failed. No retries permitted until 2026-04-22 19:28:40.615314202 +0000 UTC m=+312.734693488 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/62a54d18-9e56-4a37-945c-6a6e2bc2c991-certificates") pod "keda-metrics-apiserver-7c9f485588-n6dbc" (UID: "62a54d18-9e56-4a37-945c-6a6e2bc2c991") : references non-existent secret key: tls.crt Apr 22 19:28:40.461300 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:40.461259 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ec651285-d96e-43b8-8319-f3486a54e45b-certificates\") pod \"keda-operator-ffbb595cb-ftth9\" (UID: \"ec651285-d96e-43b8-8319-f3486a54e45b\") " pod="openshift-keda/keda-operator-ffbb595cb-ftth9" Apr 22 19:28:40.463665 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:40.463643 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ec651285-d96e-43b8-8319-f3486a54e45b-certificates\") pod \"keda-operator-ffbb595cb-ftth9\" (UID: \"ec651285-d96e-43b8-8319-f3486a54e45b\") " pod="openshift-keda/keda-operator-ffbb595cb-ftth9" Apr 22 19:28:40.662878 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:40.662838 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/62a54d18-9e56-4a37-945c-6a6e2bc2c991-certificates\") pod \"keda-metrics-apiserver-7c9f485588-n6dbc\" (UID: \"62a54d18-9e56-4a37-945c-6a6e2bc2c991\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc" Apr 22 19:28:40.665376 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:40.665355 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/62a54d18-9e56-4a37-945c-6a6e2bc2c991-certificates\") pod \"keda-metrics-apiserver-7c9f485588-n6dbc\" (UID: \"62a54d18-9e56-4a37-945c-6a6e2bc2c991\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc" Apr 22 19:28:40.728286 ip-10-0-129-145 
kubenswrapper[2574]: I0422 19:28:40.728195 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-ftth9" Apr 22 19:28:40.845260 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:40.845226 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-ftth9"] Apr 22 19:28:40.849268 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:28:40.849237 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec651285_d96e_43b8_8319_f3486a54e45b.slice/crio-6f10b6197852299c09b281ad9a2152c3a3d68c186a8125b9b8d1d830854d9ff0 WatchSource:0}: Error finding container 6f10b6197852299c09b281ad9a2152c3a3d68c186a8125b9b8d1d830854d9ff0: Status 404 returned error can't find the container with id 6f10b6197852299c09b281ad9a2152c3a3d68c186a8125b9b8d1d830854d9ff0 Apr 22 19:28:40.929317 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:40.929284 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc" Apr 22 19:28:41.044008 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:41.043864 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc"] Apr 22 19:28:41.046368 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:28:41.046338 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62a54d18_9e56_4a37_945c_6a6e2bc2c991.slice/crio-c08459dacfb24717625508a735d06024473a658e9e0b66aae75404c7d1a244e4 WatchSource:0}: Error finding container c08459dacfb24717625508a735d06024473a658e9e0b66aae75404c7d1a244e4: Status 404 returned error can't find the container with id c08459dacfb24717625508a735d06024473a658e9e0b66aae75404c7d1a244e4 Apr 22 19:28:41.403974 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:41.403882 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc" event={"ID":"62a54d18-9e56-4a37-945c-6a6e2bc2c991","Type":"ContainerStarted","Data":"c08459dacfb24717625508a735d06024473a658e9e0b66aae75404c7d1a244e4"} Apr 22 19:28:41.404714 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:41.404692 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-ftth9" event={"ID":"ec651285-d96e-43b8-8319-f3486a54e45b","Type":"ContainerStarted","Data":"6f10b6197852299c09b281ad9a2152c3a3d68c186a8125b9b8d1d830854d9ff0"} Apr 22 19:28:45.419277 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:45.419236 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc" event={"ID":"62a54d18-9e56-4a37-945c-6a6e2bc2c991","Type":"ContainerStarted","Data":"161f6a38c1b23acee2774c2162211c2ea61b043de2e1e015210745c339592c19"} Apr 22 19:28:45.419727 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:45.419324 2574 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc" Apr 22 19:28:45.420615 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:45.420587 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-ftth9" event={"ID":"ec651285-d96e-43b8-8319-f3486a54e45b","Type":"ContainerStarted","Data":"d46b58bc6749b110cb7f09c6208f07dfb042d4aaac641a742e96e984c393c081"} Apr 22 19:28:45.420732 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:45.420643 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-ftth9" Apr 22 19:28:45.437720 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:45.437677 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc" podStartSLOduration=9.582968329 podStartE2EDuration="13.437664279s" podCreationTimestamp="2026-04-22 19:28:32 +0000 UTC" firstStartedPulling="2026-04-22 19:28:41.047745256 +0000 UTC m=+313.167124543" lastFinishedPulling="2026-04-22 19:28:44.902441206 +0000 UTC m=+317.021820493" observedRunningTime="2026-04-22 19:28:45.43606333 +0000 UTC m=+317.555442638" watchObservedRunningTime="2026-04-22 19:28:45.437664279 +0000 UTC m=+317.557043587" Apr 22 19:28:45.454865 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:45.454818 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-ftth9" podStartSLOduration=9.397758651 podStartE2EDuration="13.454787967s" podCreationTimestamp="2026-04-22 19:28:32 +0000 UTC" firstStartedPulling="2026-04-22 19:28:40.850548044 +0000 UTC m=+312.969927331" lastFinishedPulling="2026-04-22 19:28:44.90757736 +0000 UTC m=+317.026956647" observedRunningTime="2026-04-22 19:28:45.453409791 +0000 UTC m=+317.572789120" watchObservedRunningTime="2026-04-22 19:28:45.454787967 +0000 UTC m=+317.574167276" Apr 22 19:28:53.382914 
ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:53.382880 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rpnc6" Apr 22 19:28:56.393502 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:56.393475 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-hml78" Apr 22 19:28:56.430535 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:28:56.430508 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-n6dbc" Apr 22 19:29:06.428902 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:06.428869 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-ftth9" Apr 22 19:29:39.487705 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.487625 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-z88zq"] Apr 22 19:29:39.496572 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.496543 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-vgt72"] Apr 22 19:29:39.496572 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.496573 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-545d8995fb-z88zq" Apr 22 19:29:39.499555 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.499499 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 19:29:39.499717 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.499694 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgt72" Apr 22 19:29:39.500898 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.500876 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 19:29:39.501029 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.500877 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 22 19:29:39.501029 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.500903 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-p8g2w\"" Apr 22 19:29:39.502250 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.502227 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 22 19:29:39.502863 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.502843 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-h47cp\"" Apr 22 19:29:39.504486 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.504464 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-z88zq"] Apr 22 19:29:39.511874 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.511852 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-vgt72"] Apr 22 19:29:39.531947 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.531921 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-hmzs7"] Apr 22 19:29:39.535359 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.535343 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-hmzs7" Apr 22 19:29:39.538060 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.538040 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 19:29:39.538060 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.538050 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-6hbj8\"" Apr 22 19:29:39.547216 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.547191 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-hmzs7"] Apr 22 19:29:39.600822 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.600767 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d2c9328-9cf3-4d9a-8f04-6a87b72db36e-cert\") pod \"kserve-controller-manager-545d8995fb-z88zq\" (UID: \"5d2c9328-9cf3-4d9a-8f04-6a87b72db36e\") " pod="kserve/kserve-controller-manager-545d8995fb-z88zq" Apr 22 19:29:39.600994 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.600843 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb95b\" (UniqueName: \"kubernetes.io/projected/d3a161b2-9b3a-40d3-8e67-dd6f929f7713-kube-api-access-cb95b\") pod \"llmisvc-controller-manager-68cc5db7c4-vgt72\" (UID: \"d3a161b2-9b3a-40d3-8e67-dd6f929f7713\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgt72" Apr 22 19:29:39.600994 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.600900 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3a161b2-9b3a-40d3-8e67-dd6f929f7713-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-vgt72\" (UID: \"d3a161b2-9b3a-40d3-8e67-dd6f929f7713\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgt72" Apr 22 19:29:39.600994 
ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.600942 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fz9k\" (UniqueName: \"kubernetes.io/projected/a82ba81b-2e61-49b3-b439-16a1541e4352-kube-api-access-8fz9k\") pod \"seaweedfs-86cc847c5c-hmzs7\" (UID: \"a82ba81b-2e61-49b3-b439-16a1541e4352\") " pod="kserve/seaweedfs-86cc847c5c-hmzs7" Apr 22 19:29:39.601113 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.600994 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a82ba81b-2e61-49b3-b439-16a1541e4352-data\") pod \"seaweedfs-86cc847c5c-hmzs7\" (UID: \"a82ba81b-2e61-49b3-b439-16a1541e4352\") " pod="kserve/seaweedfs-86cc847c5c-hmzs7" Apr 22 19:29:39.601113 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.601022 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52q85\" (UniqueName: \"kubernetes.io/projected/5d2c9328-9cf3-4d9a-8f04-6a87b72db36e-kube-api-access-52q85\") pod \"kserve-controller-manager-545d8995fb-z88zq\" (UID: \"5d2c9328-9cf3-4d9a-8f04-6a87b72db36e\") " pod="kserve/kserve-controller-manager-545d8995fb-z88zq" Apr 22 19:29:39.701886 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.701841 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cb95b\" (UniqueName: \"kubernetes.io/projected/d3a161b2-9b3a-40d3-8e67-dd6f929f7713-kube-api-access-cb95b\") pod \"llmisvc-controller-manager-68cc5db7c4-vgt72\" (UID: \"d3a161b2-9b3a-40d3-8e67-dd6f929f7713\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgt72" Apr 22 19:29:39.702068 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.701909 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3a161b2-9b3a-40d3-8e67-dd6f929f7713-cert\") pod 
\"llmisvc-controller-manager-68cc5db7c4-vgt72\" (UID: \"d3a161b2-9b3a-40d3-8e67-dd6f929f7713\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgt72" Apr 22 19:29:39.702068 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.701936 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fz9k\" (UniqueName: \"kubernetes.io/projected/a82ba81b-2e61-49b3-b439-16a1541e4352-kube-api-access-8fz9k\") pod \"seaweedfs-86cc847c5c-hmzs7\" (UID: \"a82ba81b-2e61-49b3-b439-16a1541e4352\") " pod="kserve/seaweedfs-86cc847c5c-hmzs7" Apr 22 19:29:39.702068 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.701968 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a82ba81b-2e61-49b3-b439-16a1541e4352-data\") pod \"seaweedfs-86cc847c5c-hmzs7\" (UID: \"a82ba81b-2e61-49b3-b439-16a1541e4352\") " pod="kserve/seaweedfs-86cc847c5c-hmzs7" Apr 22 19:29:39.702068 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.702046 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52q85\" (UniqueName: \"kubernetes.io/projected/5d2c9328-9cf3-4d9a-8f04-6a87b72db36e-kube-api-access-52q85\") pod \"kserve-controller-manager-545d8995fb-z88zq\" (UID: \"5d2c9328-9cf3-4d9a-8f04-6a87b72db36e\") " pod="kserve/kserve-controller-manager-545d8995fb-z88zq" Apr 22 19:29:39.702351 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:29:39.702054 2574 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 22 19:29:39.702351 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.702110 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d2c9328-9cf3-4d9a-8f04-6a87b72db36e-cert\") pod \"kserve-controller-manager-545d8995fb-z88zq\" (UID: \"5d2c9328-9cf3-4d9a-8f04-6a87b72db36e\") " 
pod="kserve/kserve-controller-manager-545d8995fb-z88zq" Apr 22 19:29:39.702351 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:29:39.702188 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3a161b2-9b3a-40d3-8e67-dd6f929f7713-cert podName:d3a161b2-9b3a-40d3-8e67-dd6f929f7713 nodeName:}" failed. No retries permitted until 2026-04-22 19:29:40.202171486 +0000 UTC m=+372.321550773 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d3a161b2-9b3a-40d3-8e67-dd6f929f7713-cert") pod "llmisvc-controller-manager-68cc5db7c4-vgt72" (UID: "d3a161b2-9b3a-40d3-8e67-dd6f929f7713") : secret "llmisvc-webhook-server-cert" not found Apr 22 19:29:39.702507 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.702448 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a82ba81b-2e61-49b3-b439-16a1541e4352-data\") pod \"seaweedfs-86cc847c5c-hmzs7\" (UID: \"a82ba81b-2e61-49b3-b439-16a1541e4352\") " pod="kserve/seaweedfs-86cc847c5c-hmzs7" Apr 22 19:29:39.704689 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.704661 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d2c9328-9cf3-4d9a-8f04-6a87b72db36e-cert\") pod \"kserve-controller-manager-545d8995fb-z88zq\" (UID: \"5d2c9328-9cf3-4d9a-8f04-6a87b72db36e\") " pod="kserve/kserve-controller-manager-545d8995fb-z88zq" Apr 22 19:29:39.712294 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.712269 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52q85\" (UniqueName: \"kubernetes.io/projected/5d2c9328-9cf3-4d9a-8f04-6a87b72db36e-kube-api-access-52q85\") pod \"kserve-controller-manager-545d8995fb-z88zq\" (UID: \"5d2c9328-9cf3-4d9a-8f04-6a87b72db36e\") " pod="kserve/kserve-controller-manager-545d8995fb-z88zq" Apr 22 19:29:39.712294 ip-10-0-129-145 kubenswrapper[2574]: I0422 
19:29:39.712283 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb95b\" (UniqueName: \"kubernetes.io/projected/d3a161b2-9b3a-40d3-8e67-dd6f929f7713-kube-api-access-cb95b\") pod \"llmisvc-controller-manager-68cc5db7c4-vgt72\" (UID: \"d3a161b2-9b3a-40d3-8e67-dd6f929f7713\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgt72" Apr 22 19:29:39.714249 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.714232 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fz9k\" (UniqueName: \"kubernetes.io/projected/a82ba81b-2e61-49b3-b439-16a1541e4352-kube-api-access-8fz9k\") pod \"seaweedfs-86cc847c5c-hmzs7\" (UID: \"a82ba81b-2e61-49b3-b439-16a1541e4352\") " pod="kserve/seaweedfs-86cc847c5c-hmzs7" Apr 22 19:29:39.809605 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.809574 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-545d8995fb-z88zq" Apr 22 19:29:39.845854 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.845819 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-hmzs7" Apr 22 19:29:39.944178 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.944155 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-z88zq"] Apr 22 19:29:39.946655 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:29:39.946618 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d2c9328_9cf3_4d9a_8f04_6a87b72db36e.slice/crio-9cb549a7263a889d90a51fbe03d3b9693bd3bf60e5a2932518dccb1a6dcc0a5f WatchSource:0}: Error finding container 9cb549a7263a889d90a51fbe03d3b9693bd3bf60e5a2932518dccb1a6dcc0a5f: Status 404 returned error can't find the container with id 9cb549a7263a889d90a51fbe03d3b9693bd3bf60e5a2932518dccb1a6dcc0a5f Apr 22 19:29:39.982362 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:39.982338 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-hmzs7"] Apr 22 19:29:39.984621 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:29:39.984592 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda82ba81b_2e61_49b3_b439_16a1541e4352.slice/crio-2faff7b013f7b0710eed1ce175989dc115635d4bfcb1b016e75b8583bf4452fc WatchSource:0}: Error finding container 2faff7b013f7b0710eed1ce175989dc115635d4bfcb1b016e75b8583bf4452fc: Status 404 returned error can't find the container with id 2faff7b013f7b0710eed1ce175989dc115635d4bfcb1b016e75b8583bf4452fc Apr 22 19:29:40.208016 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:40.207937 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3a161b2-9b3a-40d3-8e67-dd6f929f7713-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-vgt72\" (UID: \"d3a161b2-9b3a-40d3-8e67-dd6f929f7713\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgt72" Apr 22 19:29:40.210382 
ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:40.210353 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3a161b2-9b3a-40d3-8e67-dd6f929f7713-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-vgt72\" (UID: \"d3a161b2-9b3a-40d3-8e67-dd6f929f7713\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgt72" Apr 22 19:29:40.415975 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:40.415937 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgt72" Apr 22 19:29:40.611821 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:40.611763 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-hmzs7" event={"ID":"a82ba81b-2e61-49b3-b439-16a1541e4352","Type":"ContainerStarted","Data":"2faff7b013f7b0710eed1ce175989dc115635d4bfcb1b016e75b8583bf4452fc"} Apr 22 19:29:40.613142 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:40.613115 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-545d8995fb-z88zq" event={"ID":"5d2c9328-9cf3-4d9a-8f04-6a87b72db36e","Type":"ContainerStarted","Data":"9cb549a7263a889d90a51fbe03d3b9693bd3bf60e5a2932518dccb1a6dcc0a5f"} Apr 22 19:29:40.680189 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:40.680154 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-vgt72"] Apr 22 19:29:40.684324 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:29:40.684286 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd3a161b2_9b3a_40d3_8e67_dd6f929f7713.slice/crio-a6e65d27d5718f939f7d5842376a8c1eaf0bbaa0143d1f633bda80d23a7ed542 WatchSource:0}: Error finding container a6e65d27d5718f939f7d5842376a8c1eaf0bbaa0143d1f633bda80d23a7ed542: Status 404 returned error can't find the container with id a6e65d27d5718f939f7d5842376a8c1eaf0bbaa0143d1f633bda80d23a7ed542 Apr 22 
19:29:41.618320 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:41.618277 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgt72" event={"ID":"d3a161b2-9b3a-40d3-8e67-dd6f929f7713","Type":"ContainerStarted","Data":"a6e65d27d5718f939f7d5842376a8c1eaf0bbaa0143d1f633bda80d23a7ed542"} Apr 22 19:29:45.636878 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:45.636836 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-hmzs7" event={"ID":"a82ba81b-2e61-49b3-b439-16a1541e4352","Type":"ContainerStarted","Data":"1401b74e933d96e0c3362de05ee0dae658c576ad54a89971ddc5b423f30c2479"} Apr 22 19:29:45.637337 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:45.637060 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-hmzs7" Apr 22 19:29:45.638206 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:45.638185 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgt72" event={"ID":"d3a161b2-9b3a-40d3-8e67-dd6f929f7713","Type":"ContainerStarted","Data":"1f21f47199f3f529adbe81dde3d2151bcf42531c9e4694747e7eba87fb673562"} Apr 22 19:29:45.638326 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:45.638307 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgt72" Apr 22 19:29:45.639445 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:45.639425 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-545d8995fb-z88zq" event={"ID":"5d2c9328-9cf3-4d9a-8f04-6a87b72db36e","Type":"ContainerStarted","Data":"bc98636cc20bd7be0d79230cdf4842de10a7edcc236532b024b7c895ba5f11ae"} Apr 22 19:29:45.639542 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:45.639527 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-545d8995fb-z88zq" Apr 22 
19:29:45.653235 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:45.653197 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-hmzs7" podStartSLOduration=1.7613378119999998 podStartE2EDuration="6.653184627s" podCreationTimestamp="2026-04-22 19:29:39 +0000 UTC" firstStartedPulling="2026-04-22 19:29:39.985876802 +0000 UTC m=+372.105256089" lastFinishedPulling="2026-04-22 19:29:44.877723604 +0000 UTC m=+376.997102904" observedRunningTime="2026-04-22 19:29:45.65194447 +0000 UTC m=+377.771323783" watchObservedRunningTime="2026-04-22 19:29:45.653184627 +0000 UTC m=+377.772563938" Apr 22 19:29:45.667863 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:45.667795 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgt72" podStartSLOduration=2.425083384 podStartE2EDuration="6.667782978s" podCreationTimestamp="2026-04-22 19:29:39 +0000 UTC" firstStartedPulling="2026-04-22 19:29:40.686162345 +0000 UTC m=+372.805541637" lastFinishedPulling="2026-04-22 19:29:44.928861931 +0000 UTC m=+377.048241231" observedRunningTime="2026-04-22 19:29:45.666214291 +0000 UTC m=+377.785593603" watchObservedRunningTime="2026-04-22 19:29:45.667782978 +0000 UTC m=+377.787162315" Apr 22 19:29:45.681424 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:29:45.681377 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-545d8995fb-z88zq" podStartSLOduration=2.478432426 podStartE2EDuration="6.681363416s" podCreationTimestamp="2026-04-22 19:29:39 +0000 UTC" firstStartedPulling="2026-04-22 19:29:39.948289724 +0000 UTC m=+372.067669012" lastFinishedPulling="2026-04-22 19:29:44.151220715 +0000 UTC m=+376.270600002" observedRunningTime="2026-04-22 19:29:45.68090441 +0000 UTC m=+377.800283721" watchObservedRunningTime="2026-04-22 19:29:45.681363416 +0000 UTC m=+377.800742725" Apr 22 19:29:51.645201 ip-10-0-129-145 kubenswrapper[2574]: 
I0422 19:29:51.645165 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-hmzs7" Apr 22 19:30:09.674436 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:09.674390 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c688944ff-648w2"] Apr 22 19:30:16.645136 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:16.645108 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vgt72" Apr 22 19:30:16.648024 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:16.648004 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-545d8995fb-z88zq" Apr 22 19:30:18.276245 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.276216 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-z88zq"] Apr 22 19:30:18.276648 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.276405 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-545d8995fb-z88zq" podUID="5d2c9328-9cf3-4d9a-8f04-6a87b72db36e" containerName="manager" containerID="cri-o://bc98636cc20bd7be0d79230cdf4842de10a7edcc236532b024b7c895ba5f11ae" gracePeriod=10 Apr 22 19:30:18.296431 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.296402 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-w2j7l"] Apr 22 19:30:18.311849 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.311793 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-w2j7l"] Apr 22 19:30:18.311949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.311940 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-545d8995fb-w2j7l" Apr 22 19:30:18.403253 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.403216 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxvbq\" (UniqueName: \"kubernetes.io/projected/c1f8dc15-45b3-4be9-baf3-9d624a993e5f-kube-api-access-wxvbq\") pod \"kserve-controller-manager-545d8995fb-w2j7l\" (UID: \"c1f8dc15-45b3-4be9-baf3-9d624a993e5f\") " pod="kserve/kserve-controller-manager-545d8995fb-w2j7l" Apr 22 19:30:18.403414 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.403262 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1f8dc15-45b3-4be9-baf3-9d624a993e5f-cert\") pod \"kserve-controller-manager-545d8995fb-w2j7l\" (UID: \"c1f8dc15-45b3-4be9-baf3-9d624a993e5f\") " pod="kserve/kserve-controller-manager-545d8995fb-w2j7l" Apr 22 19:30:18.504383 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.504349 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxvbq\" (UniqueName: \"kubernetes.io/projected/c1f8dc15-45b3-4be9-baf3-9d624a993e5f-kube-api-access-wxvbq\") pod \"kserve-controller-manager-545d8995fb-w2j7l\" (UID: \"c1f8dc15-45b3-4be9-baf3-9d624a993e5f\") " pod="kserve/kserve-controller-manager-545d8995fb-w2j7l" Apr 22 19:30:18.504561 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.504407 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1f8dc15-45b3-4be9-baf3-9d624a993e5f-cert\") pod \"kserve-controller-manager-545d8995fb-w2j7l\" (UID: \"c1f8dc15-45b3-4be9-baf3-9d624a993e5f\") " pod="kserve/kserve-controller-manager-545d8995fb-w2j7l" Apr 22 19:30:18.507017 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.506991 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/c1f8dc15-45b3-4be9-baf3-9d624a993e5f-cert\") pod \"kserve-controller-manager-545d8995fb-w2j7l\" (UID: \"c1f8dc15-45b3-4be9-baf3-9d624a993e5f\") " pod="kserve/kserve-controller-manager-545d8995fb-w2j7l" Apr 22 19:30:18.513968 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.513939 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxvbq\" (UniqueName: \"kubernetes.io/projected/c1f8dc15-45b3-4be9-baf3-9d624a993e5f-kube-api-access-wxvbq\") pod \"kserve-controller-manager-545d8995fb-w2j7l\" (UID: \"c1f8dc15-45b3-4be9-baf3-9d624a993e5f\") " pod="kserve/kserve-controller-manager-545d8995fb-w2j7l" Apr 22 19:30:18.541846 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.541821 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-545d8995fb-z88zq" Apr 22 19:30:18.605529 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.605498 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52q85\" (UniqueName: \"kubernetes.io/projected/5d2c9328-9cf3-4d9a-8f04-6a87b72db36e-kube-api-access-52q85\") pod \"5d2c9328-9cf3-4d9a-8f04-6a87b72db36e\" (UID: \"5d2c9328-9cf3-4d9a-8f04-6a87b72db36e\") " Apr 22 19:30:18.605700 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.605539 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d2c9328-9cf3-4d9a-8f04-6a87b72db36e-cert\") pod \"5d2c9328-9cf3-4d9a-8f04-6a87b72db36e\" (UID: \"5d2c9328-9cf3-4d9a-8f04-6a87b72db36e\") " Apr 22 19:30:18.607693 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.607665 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d2c9328-9cf3-4d9a-8f04-6a87b72db36e-cert" (OuterVolumeSpecName: "cert") pod "5d2c9328-9cf3-4d9a-8f04-6a87b72db36e" (UID: "5d2c9328-9cf3-4d9a-8f04-6a87b72db36e"). 
InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:30:18.607822 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.607716 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d2c9328-9cf3-4d9a-8f04-6a87b72db36e-kube-api-access-52q85" (OuterVolumeSpecName: "kube-api-access-52q85") pod "5d2c9328-9cf3-4d9a-8f04-6a87b72db36e" (UID: "5d2c9328-9cf3-4d9a-8f04-6a87b72db36e"). InnerVolumeSpecName "kube-api-access-52q85". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:30:18.669456 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.669424 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-545d8995fb-w2j7l" Apr 22 19:30:18.706864 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.706829 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-52q85\" (UniqueName: \"kubernetes.io/projected/5d2c9328-9cf3-4d9a-8f04-6a87b72db36e-kube-api-access-52q85\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\"" Apr 22 19:30:18.706864 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.706860 2574 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d2c9328-9cf3-4d9a-8f04-6a87b72db36e-cert\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\"" Apr 22 19:30:18.741516 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.741483 2574 generic.go:358] "Generic (PLEG): container finished" podID="5d2c9328-9cf3-4d9a-8f04-6a87b72db36e" containerID="bc98636cc20bd7be0d79230cdf4842de10a7edcc236532b024b7c895ba5f11ae" exitCode=0 Apr 22 19:30:18.741675 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.741543 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-545d8995fb-z88zq" Apr 22 19:30:18.741675 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.741561 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-545d8995fb-z88zq" event={"ID":"5d2c9328-9cf3-4d9a-8f04-6a87b72db36e","Type":"ContainerDied","Data":"bc98636cc20bd7be0d79230cdf4842de10a7edcc236532b024b7c895ba5f11ae"} Apr 22 19:30:18.741675 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.741600 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-545d8995fb-z88zq" event={"ID":"5d2c9328-9cf3-4d9a-8f04-6a87b72db36e","Type":"ContainerDied","Data":"9cb549a7263a889d90a51fbe03d3b9693bd3bf60e5a2932518dccb1a6dcc0a5f"} Apr 22 19:30:18.741675 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.741615 2574 scope.go:117] "RemoveContainer" containerID="bc98636cc20bd7be0d79230cdf4842de10a7edcc236532b024b7c895ba5f11ae" Apr 22 19:30:18.750726 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.750706 2574 scope.go:117] "RemoveContainer" containerID="bc98636cc20bd7be0d79230cdf4842de10a7edcc236532b024b7c895ba5f11ae" Apr 22 19:30:18.751113 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:30:18.751085 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc98636cc20bd7be0d79230cdf4842de10a7edcc236532b024b7c895ba5f11ae\": container with ID starting with bc98636cc20bd7be0d79230cdf4842de10a7edcc236532b024b7c895ba5f11ae not found: ID does not exist" containerID="bc98636cc20bd7be0d79230cdf4842de10a7edcc236532b024b7c895ba5f11ae" Apr 22 19:30:18.751210 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.751123 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc98636cc20bd7be0d79230cdf4842de10a7edcc236532b024b7c895ba5f11ae"} err="failed to get container status 
\"bc98636cc20bd7be0d79230cdf4842de10a7edcc236532b024b7c895ba5f11ae\": rpc error: code = NotFound desc = could not find container \"bc98636cc20bd7be0d79230cdf4842de10a7edcc236532b024b7c895ba5f11ae\": container with ID starting with bc98636cc20bd7be0d79230cdf4842de10a7edcc236532b024b7c895ba5f11ae not found: ID does not exist" Apr 22 19:30:18.765436 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.765411 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-z88zq"] Apr 22 19:30:18.768181 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.768161 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-z88zq"] Apr 22 19:30:18.794208 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:18.794184 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-w2j7l"] Apr 22 19:30:18.796600 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:30:18.796575 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1f8dc15_45b3_4be9_baf3_9d624a993e5f.slice/crio-9b4524e5996015ea1629544ed6ea18c569c875b92caf1261e5b9c8729a0dd31e WatchSource:0}: Error finding container 9b4524e5996015ea1629544ed6ea18c569c875b92caf1261e5b9c8729a0dd31e: Status 404 returned error can't find the container with id 9b4524e5996015ea1629544ed6ea18c569c875b92caf1261e5b9c8729a0dd31e Apr 22 19:30:19.747341 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:19.747305 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-545d8995fb-w2j7l" event={"ID":"c1f8dc15-45b3-4be9-baf3-9d624a993e5f","Type":"ContainerStarted","Data":"bc728f17c166294a8f556cc626a2abb08ca425a94c0f7915488e8b5b7e2626dc"} Apr 22 19:30:19.747341 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:19.747343 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve/kserve-controller-manager-545d8995fb-w2j7l" event={"ID":"c1f8dc15-45b3-4be9-baf3-9d624a993e5f","Type":"ContainerStarted","Data":"9b4524e5996015ea1629544ed6ea18c569c875b92caf1261e5b9c8729a0dd31e"} Apr 22 19:30:19.747729 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:19.747441 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-545d8995fb-w2j7l" Apr 22 19:30:19.765355 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:19.765311 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-545d8995fb-w2j7l" podStartSLOduration=1.38836335 podStartE2EDuration="1.765296586s" podCreationTimestamp="2026-04-22 19:30:18 +0000 UTC" firstStartedPulling="2026-04-22 19:30:18.797821936 +0000 UTC m=+410.917201223" lastFinishedPulling="2026-04-22 19:30:19.174755168 +0000 UTC m=+411.294134459" observedRunningTime="2026-04-22 19:30:19.763445492 +0000 UTC m=+411.882824800" watchObservedRunningTime="2026-04-22 19:30:19.765296586 +0000 UTC m=+411.884675894" Apr 22 19:30:20.429773 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:20.429740 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d2c9328-9cf3-4d9a-8f04-6a87b72db36e" path="/var/lib/kubelet/pods/5d2c9328-9cf3-4d9a-8f04-6a87b72db36e/volumes" Apr 22 19:30:34.694660 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:34.694598 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7c688944ff-648w2" podUID="9e4b171f-8770-4cf9-8d75-4d14c8c2a99f" containerName="console" containerID="cri-o://0c0db87b88d328be99872b773d9a2c028ffadedc6996484a02bfde38fbe6e0be" gracePeriod=15 Apr 22 19:30:34.937956 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:34.937934 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c688944ff-648w2_9e4b171f-8770-4cf9-8d75-4d14c8c2a99f/console/0.log" Apr 22 19:30:34.938080 
ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:34.937992 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c688944ff-648w2" Apr 22 19:30:35.030438 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.030411 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-trusted-ca-bundle\") pod \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " Apr 22 19:30:35.030438 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.030442 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-console-serving-cert\") pod \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " Apr 22 19:30:35.030685 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.030463 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-console-oauth-config\") pod \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " Apr 22 19:30:35.030685 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.030487 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-oauth-serving-cert\") pod \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " Apr 22 19:30:35.030685 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.030509 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-console-config\") pod 
\"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " Apr 22 19:30:35.030685 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.030556 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbgf6\" (UniqueName: \"kubernetes.io/projected/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-kube-api-access-hbgf6\") pod \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " Apr 22 19:30:35.030685 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.030595 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-service-ca\") pod \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\" (UID: \"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f\") " Apr 22 19:30:35.031056 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.030914 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9e4b171f-8770-4cf9-8d75-4d14c8c2a99f" (UID: "9e4b171f-8770-4cf9-8d75-4d14c8c2a99f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:30:35.031136 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.031030 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9e4b171f-8770-4cf9-8d75-4d14c8c2a99f" (UID: "9e4b171f-8770-4cf9-8d75-4d14c8c2a99f"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:30:35.031136 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.031069 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-service-ca" (OuterVolumeSpecName: "service-ca") pod "9e4b171f-8770-4cf9-8d75-4d14c8c2a99f" (UID: "9e4b171f-8770-4cf9-8d75-4d14c8c2a99f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:30:35.031136 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.031067 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-console-config" (OuterVolumeSpecName: "console-config") pod "9e4b171f-8770-4cf9-8d75-4d14c8c2a99f" (UID: "9e4b171f-8770-4cf9-8d75-4d14c8c2a99f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:30:35.032711 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.032674 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-kube-api-access-hbgf6" (OuterVolumeSpecName: "kube-api-access-hbgf6") pod "9e4b171f-8770-4cf9-8d75-4d14c8c2a99f" (UID: "9e4b171f-8770-4cf9-8d75-4d14c8c2a99f"). InnerVolumeSpecName "kube-api-access-hbgf6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:30:35.033082 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.033059 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9e4b171f-8770-4cf9-8d75-4d14c8c2a99f" (UID: "9e4b171f-8770-4cf9-8d75-4d14c8c2a99f"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:30:35.033136 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.033068 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9e4b171f-8770-4cf9-8d75-4d14c8c2a99f" (UID: "9e4b171f-8770-4cf9-8d75-4d14c8c2a99f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:30:35.131429 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.131393 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hbgf6\" (UniqueName: \"kubernetes.io/projected/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-kube-api-access-hbgf6\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\"" Apr 22 19:30:35.131429 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.131424 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-service-ca\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\"" Apr 22 19:30:35.131429 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.131433 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-trusted-ca-bundle\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\"" Apr 22 19:30:35.131658 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.131442 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-console-serving-cert\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\"" Apr 22 19:30:35.131658 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.131451 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-console-oauth-config\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\"" Apr 22 19:30:35.131658 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.131459 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-oauth-serving-cert\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\"" Apr 22 19:30:35.131658 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.131468 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f-console-config\") on node \"ip-10-0-129-145.ec2.internal\" DevicePath \"\"" Apr 22 19:30:35.796703 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.796675 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c688944ff-648w2_9e4b171f-8770-4cf9-8d75-4d14c8c2a99f/console/0.log" Apr 22 19:30:35.797191 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.796713 2574 generic.go:358] "Generic (PLEG): container finished" podID="9e4b171f-8770-4cf9-8d75-4d14c8c2a99f" containerID="0c0db87b88d328be99872b773d9a2c028ffadedc6996484a02bfde38fbe6e0be" exitCode=2 Apr 22 19:30:35.797191 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.796748 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c688944ff-648w2" event={"ID":"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f","Type":"ContainerDied","Data":"0c0db87b88d328be99872b773d9a2c028ffadedc6996484a02bfde38fbe6e0be"} Apr 22 19:30:35.797191 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.796775 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c688944ff-648w2" Apr 22 19:30:35.797191 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.796829 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c688944ff-648w2" event={"ID":"9e4b171f-8770-4cf9-8d75-4d14c8c2a99f","Type":"ContainerDied","Data":"39acea018db163e574ab85a43b2e1e894d90db2221fec3da722c55b0a0c5bffd"} Apr 22 19:30:35.797191 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.796854 2574 scope.go:117] "RemoveContainer" containerID="0c0db87b88d328be99872b773d9a2c028ffadedc6996484a02bfde38fbe6e0be" Apr 22 19:30:35.804892 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.804877 2574 scope.go:117] "RemoveContainer" containerID="0c0db87b88d328be99872b773d9a2c028ffadedc6996484a02bfde38fbe6e0be" Apr 22 19:30:35.805123 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:30:35.805104 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c0db87b88d328be99872b773d9a2c028ffadedc6996484a02bfde38fbe6e0be\": container with ID starting with 0c0db87b88d328be99872b773d9a2c028ffadedc6996484a02bfde38fbe6e0be not found: ID does not exist" containerID="0c0db87b88d328be99872b773d9a2c028ffadedc6996484a02bfde38fbe6e0be" Apr 22 19:30:35.805170 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.805141 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c0db87b88d328be99872b773d9a2c028ffadedc6996484a02bfde38fbe6e0be"} err="failed to get container status \"0c0db87b88d328be99872b773d9a2c028ffadedc6996484a02bfde38fbe6e0be\": rpc error: code = NotFound desc = could not find container \"0c0db87b88d328be99872b773d9a2c028ffadedc6996484a02bfde38fbe6e0be\": container with ID starting with 0c0db87b88d328be99872b773d9a2c028ffadedc6996484a02bfde38fbe6e0be not found: ID does not exist" Apr 22 19:30:35.818602 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.818570 2574 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c688944ff-648w2"] Apr 22 19:30:35.824083 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:35.824058 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7c688944ff-648w2"] Apr 22 19:30:36.436101 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:36.436067 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e4b171f-8770-4cf9-8d75-4d14c8c2a99f" path="/var/lib/kubelet/pods/9e4b171f-8770-4cf9-8d75-4d14c8c2a99f/volumes" Apr 22 19:30:50.755367 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:50.755290 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-545d8995fb-w2j7l" Apr 22 19:30:51.771213 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:51.771181 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-pzrzc"] Apr 22 19:30:51.771589 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:51.771477 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e4b171f-8770-4cf9-8d75-4d14c8c2a99f" containerName="console" Apr 22 19:30:51.771589 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:51.771492 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e4b171f-8770-4cf9-8d75-4d14c8c2a99f" containerName="console" Apr 22 19:30:51.771589 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:51.771506 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d2c9328-9cf3-4d9a-8f04-6a87b72db36e" containerName="manager" Apr 22 19:30:51.771589 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:51.771514 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d2c9328-9cf3-4d9a-8f04-6a87b72db36e" containerName="manager" Apr 22 19:30:51.771589 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:51.771580 2574 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="5d2c9328-9cf3-4d9a-8f04-6a87b72db36e" containerName="manager" Apr 22 19:30:51.771589 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:51.771591 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e4b171f-8770-4cf9-8d75-4d14c8c2a99f" containerName="console" Apr 22 19:30:51.774459 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:51.774443 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-pzrzc" Apr 22 19:30:51.777700 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:51.777675 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 22 19:30:51.778745 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:51.778726 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-rh7jn\"" Apr 22 19:30:51.796532 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:51.796500 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-pzrzc"] Apr 22 19:30:51.859936 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:51.859905 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6708deef-56e8-47d4-a361-88893b47d60c-cert\") pod \"odh-model-controller-696fc77849-pzrzc\" (UID: \"6708deef-56e8-47d4-a361-88893b47d60c\") " pod="kserve/odh-model-controller-696fc77849-pzrzc" Apr 22 19:30:51.860064 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:51.859941 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v4k9\" (UniqueName: \"kubernetes.io/projected/6708deef-56e8-47d4-a361-88893b47d60c-kube-api-access-2v4k9\") pod \"odh-model-controller-696fc77849-pzrzc\" (UID: \"6708deef-56e8-47d4-a361-88893b47d60c\") " pod="kserve/odh-model-controller-696fc77849-pzrzc" Apr 22 
19:30:51.961221 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:51.961188 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6708deef-56e8-47d4-a361-88893b47d60c-cert\") pod \"odh-model-controller-696fc77849-pzrzc\" (UID: \"6708deef-56e8-47d4-a361-88893b47d60c\") " pod="kserve/odh-model-controller-696fc77849-pzrzc" Apr 22 19:30:51.961338 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:51.961230 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2v4k9\" (UniqueName: \"kubernetes.io/projected/6708deef-56e8-47d4-a361-88893b47d60c-kube-api-access-2v4k9\") pod \"odh-model-controller-696fc77849-pzrzc\" (UID: \"6708deef-56e8-47d4-a361-88893b47d60c\") " pod="kserve/odh-model-controller-696fc77849-pzrzc" Apr 22 19:30:51.963614 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:51.963597 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6708deef-56e8-47d4-a361-88893b47d60c-cert\") pod \"odh-model-controller-696fc77849-pzrzc\" (UID: \"6708deef-56e8-47d4-a361-88893b47d60c\") " pod="kserve/odh-model-controller-696fc77849-pzrzc" Apr 22 19:30:51.976239 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:51.976208 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v4k9\" (UniqueName: \"kubernetes.io/projected/6708deef-56e8-47d4-a361-88893b47d60c-kube-api-access-2v4k9\") pod \"odh-model-controller-696fc77849-pzrzc\" (UID: \"6708deef-56e8-47d4-a361-88893b47d60c\") " pod="kserve/odh-model-controller-696fc77849-pzrzc" Apr 22 19:30:52.085342 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:52.085260 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-pzrzc" Apr 22 19:30:52.207999 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:52.207967 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-pzrzc"] Apr 22 19:30:52.211497 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:30:52.211462 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6708deef_56e8_47d4_a361_88893b47d60c.slice/crio-b8d61a5063f692a35dd038f3b601e92a97f1405042844db3f95e6df44fae3e1a WatchSource:0}: Error finding container b8d61a5063f692a35dd038f3b601e92a97f1405042844db3f95e6df44fae3e1a: Status 404 returned error can't find the container with id b8d61a5063f692a35dd038f3b601e92a97f1405042844db3f95e6df44fae3e1a Apr 22 19:30:52.851541 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:52.851482 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-pzrzc" event={"ID":"6708deef-56e8-47d4-a361-88893b47d60c","Type":"ContainerStarted","Data":"b8d61a5063f692a35dd038f3b601e92a97f1405042844db3f95e6df44fae3e1a"} Apr 22 19:30:55.862254 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:55.862217 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-pzrzc" event={"ID":"6708deef-56e8-47d4-a361-88893b47d60c","Type":"ContainerStarted","Data":"af9ee07e11b1f54d200e8a3644b76fe94e456ad74281ec3388c95efb6984774b"} Apr 22 19:30:55.862621 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:55.862267 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-pzrzc" Apr 22 19:30:55.896445 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:30:55.896402 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-pzrzc" podStartSLOduration=2.293737887 podStartE2EDuration="4.896388343s" 
podCreationTimestamp="2026-04-22 19:30:51 +0000 UTC" firstStartedPulling="2026-04-22 19:30:52.212547015 +0000 UTC m=+444.331926302" lastFinishedPulling="2026-04-22 19:30:54.815197466 +0000 UTC m=+446.934576758" observedRunningTime="2026-04-22 19:30:55.895715717 +0000 UTC m=+448.015095039" watchObservedRunningTime="2026-04-22 19:30:55.896388343 +0000 UTC m=+448.015767651" Apr 22 19:31:06.869762 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:31:06.869732 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-pzrzc" Apr 22 19:31:27.218570 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:31:27.218533 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k"] Apr 22 19:31:27.224212 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:31:27.224189 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k" Apr 22 19:31:27.228304 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:31:27.228279 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-5ktbc\"" Apr 22 19:31:27.230344 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:31:27.229630 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k"] Apr 22 19:31:27.240645 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:31:27.240624 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k" Apr 22 19:31:27.414295 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:31:27.414263 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k"] Apr 22 19:31:27.416549 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:31:27.416498 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c8c27be_2412_4c38_b418_a6b4f24b78d9.slice/crio-39efb34e2e71a2ba64ed29332eca814a4ae273e2940f41cf629df638fab28360 WatchSource:0}: Error finding container 39efb34e2e71a2ba64ed29332eca814a4ae273e2940f41cf629df638fab28360: Status 404 returned error can't find the container with id 39efb34e2e71a2ba64ed29332eca814a4ae273e2940f41cf629df638fab28360 Apr 22 19:31:27.963705 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:31:27.963674 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k" event={"ID":"4c8c27be-2412-4c38-b418-a6b4f24b78d9","Type":"ContainerStarted","Data":"39efb34e2e71a2ba64ed29332eca814a4ae273e2940f41cf629df638fab28360"} Apr 22 19:31:42.015974 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:31:42.015935 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k" event={"ID":"4c8c27be-2412-4c38-b418-a6b4f24b78d9","Type":"ContainerStarted","Data":"55a067af5db04ed0399f134ec87ff9f228c1a6f501e4bd5de35a4ef386301ddd"} Apr 22 19:31:42.016393 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:31:42.016111 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k" Apr 22 19:31:42.017483 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:31:42.017457 2574 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k" podUID="4c8c27be-2412-4c38-b418-a6b4f24b78d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 22 19:31:42.032098 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:31:42.032056 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k" podStartSLOduration=0.963098292 podStartE2EDuration="15.032044205s" podCreationTimestamp="2026-04-22 19:31:27 +0000 UTC" firstStartedPulling="2026-04-22 19:31:27.418319013 +0000 UTC m=+479.537698301" lastFinishedPulling="2026-04-22 19:31:41.487264923 +0000 UTC m=+493.606644214" observedRunningTime="2026-04-22 19:31:42.030929827 +0000 UTC m=+494.150309135" watchObservedRunningTime="2026-04-22 19:31:42.032044205 +0000 UTC m=+494.151423513" Apr 22 19:31:43.018897 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:31:43.018859 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k" podUID="4c8c27be-2412-4c38-b418-a6b4f24b78d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 22 19:31:53.019703 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:31:53.019653 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k" podUID="4c8c27be-2412-4c38-b418-a6b4f24b78d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 22 19:32:03.018956 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:32:03.018916 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k" podUID="4c8c27be-2412-4c38-b418-a6b4f24b78d9" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.28:8080: connect: connection refused" Apr 22 19:32:13.019722 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:32:13.019678 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k" podUID="4c8c27be-2412-4c38-b418-a6b4f24b78d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 22 19:32:23.019258 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:32:23.019166 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k" podUID="4c8c27be-2412-4c38-b418-a6b4f24b78d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 22 19:32:33.020517 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:32:33.020485 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k" Apr 22 19:33:01.355975 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:01.355904 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k"] Apr 22 19:33:01.356930 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:01.356867 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k" podUID="4c8c27be-2412-4c38-b418-a6b4f24b78d9" containerName="kserve-container" containerID="cri-o://55a067af5db04ed0399f134ec87ff9f228c1a6f501e4bd5de35a4ef386301ddd" gracePeriod=30 Apr 22 19:33:01.381183 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:01.381147 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1be4a-predictor-7d9f6f6594-wlg8z"] Apr 22 19:33:01.384394 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:01.384371 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1be4a-predictor-7d9f6f6594-wlg8z" Apr 22 19:33:01.394448 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:01.394424 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1be4a-predictor-7d9f6f6594-wlg8z"] Apr 22 19:33:01.396857 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:01.396839 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1be4a-predictor-7d9f6f6594-wlg8z" Apr 22 19:33:01.543378 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:01.543348 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1be4a-predictor-7d9f6f6594-wlg8z"] Apr 22 19:33:01.546614 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:33:01.546584 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d88a16b_b00a_4a5e_a612_c6aa262424ce.slice/crio-c7dd63ac01c8f31eab8d7a122c0eb096e3bcc6bc805c3bb1b1c454ddcb6bf2fd WatchSource:0}: Error finding container c7dd63ac01c8f31eab8d7a122c0eb096e3bcc6bc805c3bb1b1c454ddcb6bf2fd: Status 404 returned error can't find the container with id c7dd63ac01c8f31eab8d7a122c0eb096e3bcc6bc805c3bb1b1c454ddcb6bf2fd Apr 22 19:33:02.267585 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:02.267545 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1be4a-predictor-7d9f6f6594-wlg8z" event={"ID":"2d88a16b-b00a-4a5e-a612-c6aa262424ce","Type":"ContainerStarted","Data":"d36e91e85d2cd3afe307dd42e89a5d21d29372a30ea8a5f106176bbcd865caab"} Apr 22 19:33:02.267585 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:02.267585 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1be4a-predictor-7d9f6f6594-wlg8z" 
event={"ID":"2d88a16b-b00a-4a5e-a612-c6aa262424ce","Type":"ContainerStarted","Data":"c7dd63ac01c8f31eab8d7a122c0eb096e3bcc6bc805c3bb1b1c454ddcb6bf2fd"} Apr 22 19:33:02.267833 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:02.267789 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-1be4a-predictor-7d9f6f6594-wlg8z" Apr 22 19:33:02.269078 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:02.269053 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1be4a-predictor-7d9f6f6594-wlg8z" podUID="2d88a16b-b00a-4a5e-a612-c6aa262424ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 19:33:02.290689 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:02.290644 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-1be4a-predictor-7d9f6f6594-wlg8z" podStartSLOduration=1.290630212 podStartE2EDuration="1.290630212s" podCreationTimestamp="2026-04-22 19:33:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:33:02.289521743 +0000 UTC m=+574.408901052" watchObservedRunningTime="2026-04-22 19:33:02.290630212 +0000 UTC m=+574.410009520" Apr 22 19:33:03.019942 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:03.019901 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k" podUID="4c8c27be-2412-4c38-b418-a6b4f24b78d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 22 19:33:03.271203 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:03.271118 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1be4a-predictor-7d9f6f6594-wlg8z" 
podUID="2d88a16b-b00a-4a5e-a612-c6aa262424ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 22 19:33:04.492567 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:04.492543 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k"
Apr 22 19:33:05.277522 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:05.277486 2574 generic.go:358] "Generic (PLEG): container finished" podID="4c8c27be-2412-4c38-b418-a6b4f24b78d9" containerID="55a067af5db04ed0399f134ec87ff9f228c1a6f501e4bd5de35a4ef386301ddd" exitCode=0
Apr 22 19:33:05.277701 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:05.277543 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k"
Apr 22 19:33:05.277701 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:05.277586 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k" event={"ID":"4c8c27be-2412-4c38-b418-a6b4f24b78d9","Type":"ContainerDied","Data":"55a067af5db04ed0399f134ec87ff9f228c1a6f501e4bd5de35a4ef386301ddd"}
Apr 22 19:33:05.277701 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:05.277620 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k" event={"ID":"4c8c27be-2412-4c38-b418-a6b4f24b78d9","Type":"ContainerDied","Data":"39efb34e2e71a2ba64ed29332eca814a4ae273e2940f41cf629df638fab28360"}
Apr 22 19:33:05.277701 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:05.277636 2574 scope.go:117] "RemoveContainer" containerID="55a067af5db04ed0399f134ec87ff9f228c1a6f501e4bd5de35a4ef386301ddd"
Apr 22 19:33:05.285635 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:05.285616 2574 scope.go:117] "RemoveContainer" containerID="55a067af5db04ed0399f134ec87ff9f228c1a6f501e4bd5de35a4ef386301ddd"
Apr 22 19:33:05.285892 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:33:05.285874 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55a067af5db04ed0399f134ec87ff9f228c1a6f501e4bd5de35a4ef386301ddd\": container with ID starting with 55a067af5db04ed0399f134ec87ff9f228c1a6f501e4bd5de35a4ef386301ddd not found: ID does not exist" containerID="55a067af5db04ed0399f134ec87ff9f228c1a6f501e4bd5de35a4ef386301ddd"
Apr 22 19:33:05.285960 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:05.285905 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55a067af5db04ed0399f134ec87ff9f228c1a6f501e4bd5de35a4ef386301ddd"} err="failed to get container status \"55a067af5db04ed0399f134ec87ff9f228c1a6f501e4bd5de35a4ef386301ddd\": rpc error: code = NotFound desc = could not find container \"55a067af5db04ed0399f134ec87ff9f228c1a6f501e4bd5de35a4ef386301ddd\": container with ID starting with 55a067af5db04ed0399f134ec87ff9f228c1a6f501e4bd5de35a4ef386301ddd not found: ID does not exist"
Apr 22 19:33:05.306251 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:05.306219 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k"]
Apr 22 19:33:05.316704 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:05.316682 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b2ca6-predictor-6f7678757b-q5b8k"]
Apr 22 19:33:06.429213 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:06.429180 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c8c27be-2412-4c38-b418-a6b4f24b78d9" path="/var/lib/kubelet/pods/4c8c27be-2412-4c38-b418-a6b4f24b78d9/volumes"
Apr 22 19:33:13.271325 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:13.271278 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1be4a-predictor-7d9f6f6594-wlg8z" podUID="2d88a16b-b00a-4a5e-a612-c6aa262424ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 22 19:33:23.271972 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:23.271929 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1be4a-predictor-7d9f6f6594-wlg8z" podUID="2d88a16b-b00a-4a5e-a612-c6aa262424ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 22 19:33:28.364534 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:28.364506 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-acl-logging/0.log"
Apr 22 19:33:28.365474 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:28.365454 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-acl-logging/0.log"
Apr 22 19:33:33.272214 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:33.272168 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1be4a-predictor-7d9f6f6594-wlg8z" podUID="2d88a16b-b00a-4a5e-a612-c6aa262424ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 22 19:33:43.271644 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:43.271592 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1be4a-predictor-7d9f6f6594-wlg8z" podUID="2d88a16b-b00a-4a5e-a612-c6aa262424ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 22 19:33:47.532497 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:47.532411 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-22a3d-predictor-69444449fb-mvct7"]
Apr 22 19:33:47.532865 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:47.532724 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c8c27be-2412-4c38-b418-a6b4f24b78d9" containerName="kserve-container"
Apr 22 19:33:47.532865 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:47.532735 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c8c27be-2412-4c38-b418-a6b4f24b78d9" containerName="kserve-container"
Apr 22 19:33:47.532865 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:47.532794 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c8c27be-2412-4c38-b418-a6b4f24b78d9" containerName="kserve-container"
Apr 22 19:33:47.535560 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:47.535545 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-22a3d-predictor-69444449fb-mvct7"
Apr 22 19:33:47.544513 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:47.544485 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-22a3d-predictor-69444449fb-mvct7"]
Apr 22 19:33:47.546714 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:47.546692 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-22a3d-predictor-69444449fb-mvct7"
Apr 22 19:33:47.683318 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:47.683297 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-22a3d-predictor-69444449fb-mvct7"]
Apr 22 19:33:47.686095 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:33:47.686069 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod891c9278_8b52_438f_9afd_f3df893b2826.slice/crio-83b0780cb3b3dc99f9b551f73c191fddb0ca7333e95c2f27c43a840b3319b84f WatchSource:0}: Error finding container 83b0780cb3b3dc99f9b551f73c191fddb0ca7333e95c2f27c43a840b3319b84f: Status 404 returned error can't find the container with id 83b0780cb3b3dc99f9b551f73c191fddb0ca7333e95c2f27c43a840b3319b84f
Apr 22 19:33:47.687939 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:47.687920 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:33:48.415538 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:48.415507 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-22a3d-predictor-69444449fb-mvct7" event={"ID":"891c9278-8b52-438f-9afd-f3df893b2826","Type":"ContainerStarted","Data":"092611070d30de59e672e9675877d312db62798da0e918b91f5a2a094f86d5a8"}
Apr 22 19:33:48.415538 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:48.415539 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-22a3d-predictor-69444449fb-mvct7" event={"ID":"891c9278-8b52-438f-9afd-f3df893b2826","Type":"ContainerStarted","Data":"83b0780cb3b3dc99f9b551f73c191fddb0ca7333e95c2f27c43a840b3319b84f"}
Apr 22 19:33:48.415742 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:48.415717 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-22a3d-predictor-69444449fb-mvct7"
Apr 22 19:33:48.417105 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:48.417080 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-22a3d-predictor-69444449fb-mvct7" podUID="891c9278-8b52-438f-9afd-f3df893b2826" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 22 19:33:48.432626 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:48.432574 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-22a3d-predictor-69444449fb-mvct7" podStartSLOduration=1.4325575449999999 podStartE2EDuration="1.432557545s" podCreationTimestamp="2026-04-22 19:33:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:33:48.432077131 +0000 UTC m=+620.551456440" watchObservedRunningTime="2026-04-22 19:33:48.432557545 +0000 UTC m=+620.551936856"
Apr 22 19:33:49.418969 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:49.418927 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-22a3d-predictor-69444449fb-mvct7" podUID="891c9278-8b52-438f-9afd-f3df893b2826" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 22 19:33:53.271986 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:53.271950 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-1be4a-predictor-7d9f6f6594-wlg8z"
Apr 22 19:33:59.419854 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:33:59.419789 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-22a3d-predictor-69444449fb-mvct7" podUID="891c9278-8b52-438f-9afd-f3df893b2826" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 22 19:34:09.419589 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:34:09.419543 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-22a3d-predictor-69444449fb-mvct7" podUID="891c9278-8b52-438f-9afd-f3df893b2826" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 22 19:34:19.419963 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:34:19.419917 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-22a3d-predictor-69444449fb-mvct7" podUID="891c9278-8b52-438f-9afd-f3df893b2826" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 22 19:34:29.419493 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:34:29.419452 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-22a3d-predictor-69444449fb-mvct7" podUID="891c9278-8b52-438f-9afd-f3df893b2826" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 22 19:34:39.420821 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:34:39.420772 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-22a3d-predictor-69444449fb-mvct7"
Apr 22 19:38:28.391409 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:38:28.391314 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-acl-logging/0.log"
Apr 22 19:38:28.395830 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:38:28.395783 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-acl-logging/0.log"
Apr 22 19:42:26.189386 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:26.189350 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1be4a-predictor-7d9f6f6594-wlg8z"]
Apr 22 19:42:26.191823 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:26.189636 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-1be4a-predictor-7d9f6f6594-wlg8z" podUID="2d88a16b-b00a-4a5e-a612-c6aa262424ce" containerName="kserve-container" containerID="cri-o://d36e91e85d2cd3afe307dd42e89a5d21d29372a30ea8a5f106176bbcd865caab" gracePeriod=30
Apr 22 19:42:26.266037 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:26.266002 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn"]
Apr 22 19:42:26.269550 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:26.269529 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn"
Apr 22 19:42:26.274366 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:26.274339 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn"]
Apr 22 19:42:26.281568 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:26.281546 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn"
Apr 22 19:42:26.409285 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:26.409204 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn"]
Apr 22 19:42:26.412575 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:42:26.412540 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c692a3c_7b5a_406c_a638_c87a2a3d9c19.slice/crio-82f14d25332b673f0244c4cc10f69f0ec7ffa05213a48d355c8b178409ece7f0 WatchSource:0}: Error finding container 82f14d25332b673f0244c4cc10f69f0ec7ffa05213a48d355c8b178409ece7f0: Status 404 returned error can't find the container with id 82f14d25332b673f0244c4cc10f69f0ec7ffa05213a48d355c8b178409ece7f0
Apr 22 19:42:26.414730 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:26.414712 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:42:27.032635 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:27.032596 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn" event={"ID":"3c692a3c-7b5a-406c-a638-c87a2a3d9c19","Type":"ContainerStarted","Data":"6678fe30671fc53a90ff1c5635cf1523a53ba66a0c8ca4433ed819b88a4c8b35"}
Apr 22 19:42:27.032854 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:27.032641 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn" event={"ID":"3c692a3c-7b5a-406c-a638-c87a2a3d9c19","Type":"ContainerStarted","Data":"82f14d25332b673f0244c4cc10f69f0ec7ffa05213a48d355c8b178409ece7f0"}
Apr 22 19:42:27.032854 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:27.032829 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn"
Apr 22 19:42:27.034150 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:27.034118 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn" podUID="3c692a3c-7b5a-406c-a638-c87a2a3d9c19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 22 19:42:27.047701 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:27.047659 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn" podStartSLOduration=1.047645483 podStartE2EDuration="1.047645483s" podCreationTimestamp="2026-04-22 19:42:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:42:27.046507099 +0000 UTC m=+1139.165886409" watchObservedRunningTime="2026-04-22 19:42:27.047645483 +0000 UTC m=+1139.167024791"
Apr 22 19:42:28.035336 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:28.035300 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn" podUID="3c692a3c-7b5a-406c-a638-c87a2a3d9c19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 22 19:42:29.437036 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:29.437014 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1be4a-predictor-7d9f6f6594-wlg8z"
Apr 22 19:42:30.042216 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:30.042180 2574 generic.go:358] "Generic (PLEG): container finished" podID="2d88a16b-b00a-4a5e-a612-c6aa262424ce" containerID="d36e91e85d2cd3afe307dd42e89a5d21d29372a30ea8a5f106176bbcd865caab" exitCode=0
Apr 22 19:42:30.042375 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:30.042234 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1be4a-predictor-7d9f6f6594-wlg8z" event={"ID":"2d88a16b-b00a-4a5e-a612-c6aa262424ce","Type":"ContainerDied","Data":"d36e91e85d2cd3afe307dd42e89a5d21d29372a30ea8a5f106176bbcd865caab"}
Apr 22 19:42:30.042375 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:30.042240 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1be4a-predictor-7d9f6f6594-wlg8z"
Apr 22 19:42:30.042375 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:30.042260 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1be4a-predictor-7d9f6f6594-wlg8z" event={"ID":"2d88a16b-b00a-4a5e-a612-c6aa262424ce","Type":"ContainerDied","Data":"c7dd63ac01c8f31eab8d7a122c0eb096e3bcc6bc805c3bb1b1c454ddcb6bf2fd"}
Apr 22 19:42:30.042375 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:30.042277 2574 scope.go:117] "RemoveContainer" containerID="d36e91e85d2cd3afe307dd42e89a5d21d29372a30ea8a5f106176bbcd865caab"
Apr 22 19:42:30.050413 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:30.050389 2574 scope.go:117] "RemoveContainer" containerID="d36e91e85d2cd3afe307dd42e89a5d21d29372a30ea8a5f106176bbcd865caab"
Apr 22 19:42:30.050652 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:42:30.050632 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d36e91e85d2cd3afe307dd42e89a5d21d29372a30ea8a5f106176bbcd865caab\": container with ID starting with d36e91e85d2cd3afe307dd42e89a5d21d29372a30ea8a5f106176bbcd865caab not found: ID does not exist" containerID="d36e91e85d2cd3afe307dd42e89a5d21d29372a30ea8a5f106176bbcd865caab"
Apr 22 19:42:30.050738 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:30.050660 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d36e91e85d2cd3afe307dd42e89a5d21d29372a30ea8a5f106176bbcd865caab"} err="failed to get container status \"d36e91e85d2cd3afe307dd42e89a5d21d29372a30ea8a5f106176bbcd865caab\": rpc error: code = NotFound desc = could not find container \"d36e91e85d2cd3afe307dd42e89a5d21d29372a30ea8a5f106176bbcd865caab\": container with ID starting with d36e91e85d2cd3afe307dd42e89a5d21d29372a30ea8a5f106176bbcd865caab not found: ID does not exist"
Apr 22 19:42:30.062839 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:30.062814 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1be4a-predictor-7d9f6f6594-wlg8z"]
Apr 22 19:42:30.065964 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:30.065945 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1be4a-predictor-7d9f6f6594-wlg8z"]
Apr 22 19:42:30.430087 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:30.430014 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d88a16b-b00a-4a5e-a612-c6aa262424ce" path="/var/lib/kubelet/pods/2d88a16b-b00a-4a5e-a612-c6aa262424ce/volumes"
Apr 22 19:42:38.036157 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:38.036115 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn" podUID="3c692a3c-7b5a-406c-a638-c87a2a3d9c19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 22 19:42:48.035850 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:48.035731 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn" podUID="3c692a3c-7b5a-406c-a638-c87a2a3d9c19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 22 19:42:58.035774 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:42:58.035731 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn" podUID="3c692a3c-7b5a-406c-a638-c87a2a3d9c19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 22 19:43:08.036108 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:08.036065 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn" podUID="3c692a3c-7b5a-406c-a638-c87a2a3d9c19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 22 19:43:12.322887 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:12.322845 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-22a3d-predictor-69444449fb-mvct7"]
Apr 22 19:43:12.323269 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:12.323135 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-22a3d-predictor-69444449fb-mvct7" podUID="891c9278-8b52-438f-9afd-f3df893b2826" containerName="kserve-container" containerID="cri-o://092611070d30de59e672e9675877d312db62798da0e918b91f5a2a094f86d5a8" gracePeriod=30
Apr 22 19:43:12.352894 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:12.352858 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb"]
Apr 22 19:43:12.353337 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:12.353321 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d88a16b-b00a-4a5e-a612-c6aa262424ce" containerName="kserve-container"
Apr 22 19:43:12.353398 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:12.353341 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d88a16b-b00a-4a5e-a612-c6aa262424ce" containerName="kserve-container"
Apr 22 19:43:12.353451 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:12.353418 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d88a16b-b00a-4a5e-a612-c6aa262424ce" containerName="kserve-container"
Apr 22 19:43:12.356589 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:12.356568 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb"
Apr 22 19:43:12.366003 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:12.365976 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb"]
Apr 22 19:43:12.367653 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:12.367631 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb"
Apr 22 19:43:12.704627 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:12.704542 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb"]
Apr 22 19:43:12.707336 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:43:12.707308 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8afd4c3b_c1d1_41b7_baeb_5bcce37dff34.slice/crio-9a2b61e3e98aa1bb661e4b299288cb18243069e89276f4b0d1697bd3fe8e63e6 WatchSource:0}: Error finding container 9a2b61e3e98aa1bb661e4b299288cb18243069e89276f4b0d1697bd3fe8e63e6: Status 404 returned error can't find the container with id 9a2b61e3e98aa1bb661e4b299288cb18243069e89276f4b0d1697bd3fe8e63e6
Apr 22 19:43:13.190762 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:13.190725 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb" event={"ID":"8afd4c3b-c1d1-41b7-baeb-5bcce37dff34","Type":"ContainerStarted","Data":"0935ccfe9b80a7633c168dc1415448cec113bb4094ebb43f9f0e61e7218ec840"}
Apr 22 19:43:13.190762 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:13.190765 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb" event={"ID":"8afd4c3b-c1d1-41b7-baeb-5bcce37dff34","Type":"ContainerStarted","Data":"9a2b61e3e98aa1bb661e4b299288cb18243069e89276f4b0d1697bd3fe8e63e6"}
Apr 22 19:43:13.191046 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:13.190930 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb"
Apr 22 19:43:13.192182 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:13.192158 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb" podUID="8afd4c3b-c1d1-41b7-baeb-5bcce37dff34" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 22 19:43:13.207407 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:13.207358 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb" podStartSLOduration=1.207335498 podStartE2EDuration="1.207335498s" podCreationTimestamp="2026-04-22 19:43:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:43:13.20564586 +0000 UTC m=+1185.325025167" watchObservedRunningTime="2026-04-22 19:43:13.207335498 +0000 UTC m=+1185.326714806"
Apr 22 19:43:14.195055 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:14.195011 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb" podUID="8afd4c3b-c1d1-41b7-baeb-5bcce37dff34" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 22 19:43:15.564628 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:15.564602 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-22a3d-predictor-69444449fb-mvct7"
Apr 22 19:43:16.202689 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:16.202651 2574 generic.go:358] "Generic (PLEG): container finished" podID="891c9278-8b52-438f-9afd-f3df893b2826" containerID="092611070d30de59e672e9675877d312db62798da0e918b91f5a2a094f86d5a8" exitCode=0
Apr 22 19:43:16.202909 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:16.202711 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-22a3d-predictor-69444449fb-mvct7"
Apr 22 19:43:16.202909 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:16.202736 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-22a3d-predictor-69444449fb-mvct7" event={"ID":"891c9278-8b52-438f-9afd-f3df893b2826","Type":"ContainerDied","Data":"092611070d30de59e672e9675877d312db62798da0e918b91f5a2a094f86d5a8"}
Apr 22 19:43:16.202909 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:16.202784 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-22a3d-predictor-69444449fb-mvct7" event={"ID":"891c9278-8b52-438f-9afd-f3df893b2826","Type":"ContainerDied","Data":"83b0780cb3b3dc99f9b551f73c191fddb0ca7333e95c2f27c43a840b3319b84f"}
Apr 22 19:43:16.202909 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:16.202827 2574 scope.go:117] "RemoveContainer" containerID="092611070d30de59e672e9675877d312db62798da0e918b91f5a2a094f86d5a8"
Apr 22 19:43:16.211023 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:16.211003 2574 scope.go:117] "RemoveContainer" containerID="092611070d30de59e672e9675877d312db62798da0e918b91f5a2a094f86d5a8"
Apr 22 19:43:16.211261 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:43:16.211243 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"092611070d30de59e672e9675877d312db62798da0e918b91f5a2a094f86d5a8\": container with ID starting with 092611070d30de59e672e9675877d312db62798da0e918b91f5a2a094f86d5a8 not found: ID does not exist" containerID="092611070d30de59e672e9675877d312db62798da0e918b91f5a2a094f86d5a8"
Apr 22 19:43:16.211319 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:16.211271 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"092611070d30de59e672e9675877d312db62798da0e918b91f5a2a094f86d5a8"} err="failed to get container status \"092611070d30de59e672e9675877d312db62798da0e918b91f5a2a094f86d5a8\": rpc error: code = NotFound desc = could not find container \"092611070d30de59e672e9675877d312db62798da0e918b91f5a2a094f86d5a8\": container with ID starting with 092611070d30de59e672e9675877d312db62798da0e918b91f5a2a094f86d5a8 not found: ID does not exist"
Apr 22 19:43:16.223541 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:16.223514 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-22a3d-predictor-69444449fb-mvct7"]
Apr 22 19:43:16.227095 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:16.227073 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-22a3d-predictor-69444449fb-mvct7"]
Apr 22 19:43:16.430742 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:16.430703 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="891c9278-8b52-438f-9afd-f3df893b2826" path="/var/lib/kubelet/pods/891c9278-8b52-438f-9afd-f3df893b2826/volumes"
Apr 22 19:43:18.037445 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:18.037414 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn"
Apr 22 19:43:24.195064 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:24.195022 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb" podUID="8afd4c3b-c1d1-41b7-baeb-5bcce37dff34" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 22 19:43:28.411587 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:28.411558 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-acl-logging/0.log"
Apr 22 19:43:28.416355 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:28.416335 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-acl-logging/0.log"
Apr 22 19:43:34.195186 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:34.195140 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb" podUID="8afd4c3b-c1d1-41b7-baeb-5bcce37dff34" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 22 19:43:44.195727 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:44.195676 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb" podUID="8afd4c3b-c1d1-41b7-baeb-5bcce37dff34" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 22 19:43:46.533093 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:46.533058 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn"]
Apr 22 19:43:46.533536 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:46.533286 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn" podUID="3c692a3c-7b5a-406c-a638-c87a2a3d9c19" containerName="kserve-container" containerID="cri-o://6678fe30671fc53a90ff1c5635cf1523a53ba66a0c8ca4433ed819b88a4c8b35" gracePeriod=30
Apr 22 19:43:46.606777 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:46.606741 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-62743-predictor-54cbd885cb-cz89z"]
Apr 22 19:43:46.607235 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:46.607220 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="891c9278-8b52-438f-9afd-f3df893b2826" containerName="kserve-container"
Apr 22 19:43:46.607283 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:46.607238 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="891c9278-8b52-438f-9afd-f3df893b2826" containerName="kserve-container"
Apr 22 19:43:46.607323 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:46.607314 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="891c9278-8b52-438f-9afd-f3df893b2826" containerName="kserve-container"
Apr 22 19:43:46.611929 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:46.611910 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-62743-predictor-54cbd885cb-cz89z"
Apr 22 19:43:46.619554 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:46.619253 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-62743-predictor-54cbd885cb-cz89z"]
Apr 22 19:43:46.623419 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:46.623395 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-62743-predictor-54cbd885cb-cz89z"
Apr 22 19:43:46.755471 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:46.755438 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-62743-predictor-54cbd885cb-cz89z"]
Apr 22 19:43:46.758668 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:43:46.758637 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd16a84c8_15e0_40f5_8a21_70ae10ee4b53.slice/crio-22855985facf21a802393dda938c0844adf7d38998c2ff99f0001238290c6ef1 WatchSource:0}: Error finding container 22855985facf21a802393dda938c0844adf7d38998c2ff99f0001238290c6ef1: Status 404 returned error can't find the container with id 22855985facf21a802393dda938c0844adf7d38998c2ff99f0001238290c6ef1
Apr 22 19:43:47.307383 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:47.307332 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-62743-predictor-54cbd885cb-cz89z" event={"ID":"d16a84c8-15e0-40f5-8a21-70ae10ee4b53","Type":"ContainerStarted","Data":"45a34ea550ea4261bb66172488827e4aa2347a31be4b9ab8bf2d0dad3d886592"}
Apr 22 19:43:47.307383 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:47.307385 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-62743-predictor-54cbd885cb-cz89z" event={"ID":"d16a84c8-15e0-40f5-8a21-70ae10ee4b53","Type":"ContainerStarted","Data":"22855985facf21a802393dda938c0844adf7d38998c2ff99f0001238290c6ef1"}
Apr 22 19:43:47.307603 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:47.307491 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-62743-predictor-54cbd885cb-cz89z"
Apr 22 19:43:47.308751 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:47.308722 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-62743-predictor-54cbd885cb-cz89z" podUID="d16a84c8-15e0-40f5-8a21-70ae10ee4b53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 22 19:43:47.340285 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:47.340229 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-62743-predictor-54cbd885cb-cz89z" podStartSLOduration=1.340216459 podStartE2EDuration="1.340216459s" podCreationTimestamp="2026-04-22 19:43:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:43:47.339107473 +0000 UTC m=+1219.458486781" watchObservedRunningTime="2026-04-22 19:43:47.340216459 +0000 UTC m=+1219.459595768"
Apr 22 19:43:48.035780 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:48.035738 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn" podUID="3c692a3c-7b5a-406c-a638-c87a2a3d9c19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 22 19:43:48.310480 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:48.310384 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-62743-predictor-54cbd885cb-cz89z" podUID="d16a84c8-15e0-40f5-8a21-70ae10ee4b53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 22 19:43:49.678755 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:49.678735 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn"
Apr 22 19:43:50.318084 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:50.318046 2574 generic.go:358] "Generic (PLEG): container finished" podID="3c692a3c-7b5a-406c-a638-c87a2a3d9c19" containerID="6678fe30671fc53a90ff1c5635cf1523a53ba66a0c8ca4433ed819b88a4c8b35" exitCode=0
Apr 22 19:43:50.318251 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:50.318089 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn" event={"ID":"3c692a3c-7b5a-406c-a638-c87a2a3d9c19","Type":"ContainerDied","Data":"6678fe30671fc53a90ff1c5635cf1523a53ba66a0c8ca4433ed819b88a4c8b35"}
Apr 22 19:43:50.318251 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:50.318103 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn" Apr 22 19:43:50.318251 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:50.318112 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn" event={"ID":"3c692a3c-7b5a-406c-a638-c87a2a3d9c19","Type":"ContainerDied","Data":"82f14d25332b673f0244c4cc10f69f0ec7ffa05213a48d355c8b178409ece7f0"} Apr 22 19:43:50.318251 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:50.318131 2574 scope.go:117] "RemoveContainer" containerID="6678fe30671fc53a90ff1c5635cf1523a53ba66a0c8ca4433ed819b88a4c8b35" Apr 22 19:43:50.326098 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:50.326075 2574 scope.go:117] "RemoveContainer" containerID="6678fe30671fc53a90ff1c5635cf1523a53ba66a0c8ca4433ed819b88a4c8b35" Apr 22 19:43:50.326344 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:43:50.326325 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6678fe30671fc53a90ff1c5635cf1523a53ba66a0c8ca4433ed819b88a4c8b35\": container with ID starting with 6678fe30671fc53a90ff1c5635cf1523a53ba66a0c8ca4433ed819b88a4c8b35 not found: ID does not exist" containerID="6678fe30671fc53a90ff1c5635cf1523a53ba66a0c8ca4433ed819b88a4c8b35" Apr 22 19:43:50.326397 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:50.326352 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6678fe30671fc53a90ff1c5635cf1523a53ba66a0c8ca4433ed819b88a4c8b35"} err="failed to get container status \"6678fe30671fc53a90ff1c5635cf1523a53ba66a0c8ca4433ed819b88a4c8b35\": rpc error: code = NotFound desc = could not find container \"6678fe30671fc53a90ff1c5635cf1523a53ba66a0c8ca4433ed819b88a4c8b35\": container with ID starting with 6678fe30671fc53a90ff1c5635cf1523a53ba66a0c8ca4433ed819b88a4c8b35 not found: ID does not exist" Apr 22 19:43:50.338589 ip-10-0-129-145 
kubenswrapper[2574]: I0422 19:43:50.338563 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn"] Apr 22 19:43:50.342668 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:50.342648 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0213e-predictor-6bb556f566-44hqn"] Apr 22 19:43:50.429189 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:50.429156 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c692a3c-7b5a-406c-a638-c87a2a3d9c19" path="/var/lib/kubelet/pods/3c692a3c-7b5a-406c-a638-c87a2a3d9c19/volumes" Apr 22 19:43:54.195133 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:54.195089 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb" podUID="8afd4c3b-c1d1-41b7-baeb-5bcce37dff34" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 22 19:43:58.311146 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:43:58.311102 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-62743-predictor-54cbd885cb-cz89z" podUID="d16a84c8-15e0-40f5-8a21-70ae10ee4b53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 22 19:44:04.196531 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:04.196498 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb" Apr 22 19:44:08.311122 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:08.311081 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-62743-predictor-54cbd885cb-cz89z" podUID="d16a84c8-15e0-40f5-8a21-70ae10ee4b53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: 
connection refused" Apr 22 19:44:18.311023 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:18.310928 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-62743-predictor-54cbd885cb-cz89z" podUID="d16a84c8-15e0-40f5-8a21-70ae10ee4b53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 22 19:44:28.310483 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:28.310443 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-62743-predictor-54cbd885cb-cz89z" podUID="d16a84c8-15e0-40f5-8a21-70ae10ee4b53" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 22 19:44:32.563385 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:32.563330 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb"] Apr 22 19:44:32.563834 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:32.563669 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb" podUID="8afd4c3b-c1d1-41b7-baeb-5bcce37dff34" containerName="kserve-container" containerID="cri-o://0935ccfe9b80a7633c168dc1415448cec113bb4094ebb43f9f0e61e7218ec840" gracePeriod=30 Apr 22 19:44:32.595758 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:32.595729 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7d5c8-predictor-595fbbfc94-q5ghv"] Apr 22 19:44:32.596065 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:32.596052 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c692a3c-7b5a-406c-a638-c87a2a3d9c19" containerName="kserve-container" Apr 22 19:44:32.596119 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:32.596066 2574 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3c692a3c-7b5a-406c-a638-c87a2a3d9c19" containerName="kserve-container" Apr 22 19:44:32.596174 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:32.596126 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3c692a3c-7b5a-406c-a638-c87a2a3d9c19" containerName="kserve-container" Apr 22 19:44:32.599370 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:32.599350 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7d5c8-predictor-595fbbfc94-q5ghv" Apr 22 19:44:32.607835 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:32.607303 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7d5c8-predictor-595fbbfc94-q5ghv"] Apr 22 19:44:32.612073 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:32.612049 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7d5c8-predictor-595fbbfc94-q5ghv" Apr 22 19:44:32.752674 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:32.752642 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7d5c8-predictor-595fbbfc94-q5ghv"] Apr 22 19:44:32.755042 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:44:32.755012 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod611955a8_9b65_46d1_a592_5a176628ebed.slice/crio-53055eb0bc6a2fcbfe01b812f95471222a46c3606b97286a58fee11909ebb9c7 WatchSource:0}: Error finding container 53055eb0bc6a2fcbfe01b812f95471222a46c3606b97286a58fee11909ebb9c7: Status 404 returned error can't find the container with id 53055eb0bc6a2fcbfe01b812f95471222a46c3606b97286a58fee11909ebb9c7 Apr 22 19:44:33.454332 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:33.454296 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7d5c8-predictor-595fbbfc94-q5ghv" 
event={"ID":"611955a8-9b65-46d1-a592-5a176628ebed","Type":"ContainerStarted","Data":"b1fd22f8ba216638f87ea2a3c01cd9bc0d9618875b20326b27c5359e73355b96"} Apr 22 19:44:33.454513 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:33.454338 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7d5c8-predictor-595fbbfc94-q5ghv" event={"ID":"611955a8-9b65-46d1-a592-5a176628ebed","Type":"ContainerStarted","Data":"53055eb0bc6a2fcbfe01b812f95471222a46c3606b97286a58fee11909ebb9c7"} Apr 22 19:44:33.454556 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:33.454517 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-7d5c8-predictor-595fbbfc94-q5ghv" Apr 22 19:44:33.455696 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:33.455672 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7d5c8-predictor-595fbbfc94-q5ghv" podUID="611955a8-9b65-46d1-a592-5a176628ebed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 19:44:33.469839 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:33.469780 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-7d5c8-predictor-595fbbfc94-q5ghv" podStartSLOduration=1.469768999 podStartE2EDuration="1.469768999s" podCreationTimestamp="2026-04-22 19:44:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:44:33.46879365 +0000 UTC m=+1265.588172982" watchObservedRunningTime="2026-04-22 19:44:33.469768999 +0000 UTC m=+1265.589148309" Apr 22 19:44:34.195563 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:34.195528 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb" podUID="8afd4c3b-c1d1-41b7-baeb-5bcce37dff34" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 22 19:44:34.457400 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:34.457301 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7d5c8-predictor-595fbbfc94-q5ghv" podUID="611955a8-9b65-46d1-a592-5a176628ebed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 19:44:35.714860 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:35.714834 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb" Apr 22 19:44:36.465385 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:36.465352 2574 generic.go:358] "Generic (PLEG): container finished" podID="8afd4c3b-c1d1-41b7-baeb-5bcce37dff34" containerID="0935ccfe9b80a7633c168dc1415448cec113bb4094ebb43f9f0e61e7218ec840" exitCode=0 Apr 22 19:44:36.465533 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:36.465404 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb" event={"ID":"8afd4c3b-c1d1-41b7-baeb-5bcce37dff34","Type":"ContainerDied","Data":"0935ccfe9b80a7633c168dc1415448cec113bb4094ebb43f9f0e61e7218ec840"} Apr 22 19:44:36.465533 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:36.465417 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb" Apr 22 19:44:36.465533 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:36.465435 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb" event={"ID":"8afd4c3b-c1d1-41b7-baeb-5bcce37dff34","Type":"ContainerDied","Data":"9a2b61e3e98aa1bb661e4b299288cb18243069e89276f4b0d1697bd3fe8e63e6"} Apr 22 19:44:36.465533 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:36.465451 2574 scope.go:117] "RemoveContainer" containerID="0935ccfe9b80a7633c168dc1415448cec113bb4094ebb43f9f0e61e7218ec840" Apr 22 19:44:36.473047 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:36.473020 2574 scope.go:117] "RemoveContainer" containerID="0935ccfe9b80a7633c168dc1415448cec113bb4094ebb43f9f0e61e7218ec840" Apr 22 19:44:36.473302 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:44:36.473283 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0935ccfe9b80a7633c168dc1415448cec113bb4094ebb43f9f0e61e7218ec840\": container with ID starting with 0935ccfe9b80a7633c168dc1415448cec113bb4094ebb43f9f0e61e7218ec840 not found: ID does not exist" containerID="0935ccfe9b80a7633c168dc1415448cec113bb4094ebb43f9f0e61e7218ec840" Apr 22 19:44:36.473363 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:36.473311 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0935ccfe9b80a7633c168dc1415448cec113bb4094ebb43f9f0e61e7218ec840"} err="failed to get container status \"0935ccfe9b80a7633c168dc1415448cec113bb4094ebb43f9f0e61e7218ec840\": rpc error: code = NotFound desc = could not find container \"0935ccfe9b80a7633c168dc1415448cec113bb4094ebb43f9f0e61e7218ec840\": container with ID starting with 0935ccfe9b80a7633c168dc1415448cec113bb4094ebb43f9f0e61e7218ec840 not found: ID does not exist" Apr 22 19:44:36.484879 ip-10-0-129-145 
kubenswrapper[2574]: I0422 19:44:36.484855 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb"] Apr 22 19:44:36.488626 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:36.488603 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3becf-predictor-7846f6d998-8q6cb"] Apr 22 19:44:38.311993 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:38.311956 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-62743-predictor-54cbd885cb-cz89z" Apr 22 19:44:38.428774 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:38.428743 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8afd4c3b-c1d1-41b7-baeb-5bcce37dff34" path="/var/lib/kubelet/pods/8afd4c3b-c1d1-41b7-baeb-5bcce37dff34/volumes" Apr 22 19:44:44.458409 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:44.458359 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7d5c8-predictor-595fbbfc94-q5ghv" podUID="611955a8-9b65-46d1-a592-5a176628ebed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 19:44:54.457453 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:44:54.457407 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7d5c8-predictor-595fbbfc94-q5ghv" podUID="611955a8-9b65-46d1-a592-5a176628ebed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 19:45:04.458288 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:45:04.458246 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7d5c8-predictor-595fbbfc94-q5ghv" podUID="611955a8-9b65-46d1-a592-5a176628ebed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: 
connection refused" Apr 22 19:45:14.458127 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:45:14.458084 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7d5c8-predictor-595fbbfc94-q5ghv" podUID="611955a8-9b65-46d1-a592-5a176628ebed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 22 19:45:24.459526 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:45:24.459494 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-7d5c8-predictor-595fbbfc94-q5ghv" Apr 22 19:48:28.432305 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:48:28.432278 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-acl-logging/0.log" Apr 22 19:48:28.438040 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:48:28.438020 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-acl-logging/0.log" Apr 22 19:53:11.472894 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:11.472856 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-62743-predictor-54cbd885cb-cz89z"] Apr 22 19:53:11.475587 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:11.473367 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-62743-predictor-54cbd885cb-cz89z" podUID="d16a84c8-15e0-40f5-8a21-70ae10ee4b53" containerName="kserve-container" containerID="cri-o://45a34ea550ea4261bb66172488827e4aa2347a31be4b9ab8bf2d0dad3d886592" gracePeriod=30 Apr 22 19:53:11.562545 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:11.562507 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk"] Apr 22 19:53:11.562871 ip-10-0-129-145 
kubenswrapper[2574]: I0422 19:53:11.562854 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8afd4c3b-c1d1-41b7-baeb-5bcce37dff34" containerName="kserve-container" Apr 22 19:53:11.562871 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:11.562872 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8afd4c3b-c1d1-41b7-baeb-5bcce37dff34" containerName="kserve-container" Apr 22 19:53:11.562974 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:11.562932 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="8afd4c3b-c1d1-41b7-baeb-5bcce37dff34" containerName="kserve-container" Apr 22 19:53:11.565820 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:11.565787 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk" Apr 22 19:53:11.571823 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:11.571782 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk"] Apr 22 19:53:11.577027 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:11.577008 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk" Apr 22 19:53:11.706255 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:11.706223 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk"] Apr 22 19:53:11.709535 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:53:11.709504 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2d2887f_0262_4abc_a637_10a452c2131b.slice/crio-266c4975ea1776aea5d22499942a1bd6bb0ba7e347153e16438e743ca6e5b683 WatchSource:0}: Error finding container 266c4975ea1776aea5d22499942a1bd6bb0ba7e347153e16438e743ca6e5b683: Status 404 returned error can't find the container with id 266c4975ea1776aea5d22499942a1bd6bb0ba7e347153e16438e743ca6e5b683 Apr 22 19:53:11.711686 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:11.711664 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:53:12.090595 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:12.090563 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk" event={"ID":"a2d2887f-0262-4abc-a637-10a452c2131b","Type":"ContainerStarted","Data":"6849d3dd6aab5d92e7e774da24592b68c06dd9ebd7412b243bcac0f263328225"} Apr 22 19:53:12.090595 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:12.090598 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk" event={"ID":"a2d2887f-0262-4abc-a637-10a452c2131b","Type":"ContainerStarted","Data":"266c4975ea1776aea5d22499942a1bd6bb0ba7e347153e16438e743ca6e5b683"} Apr 22 19:53:12.090828 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:12.090795 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk" Apr 22 19:53:12.091889 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:12.091865 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk" podUID="a2d2887f-0262-4abc-a637-10a452c2131b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 19:53:12.108221 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:12.108167 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk" podStartSLOduration=1.108149602 podStartE2EDuration="1.108149602s" podCreationTimestamp="2026-04-22 19:53:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:53:12.106458177 +0000 UTC m=+1784.225837486" watchObservedRunningTime="2026-04-22 19:53:12.108149602 +0000 UTC m=+1784.227528912" Apr 22 19:53:13.094602 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:13.094559 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk" podUID="a2d2887f-0262-4abc-a637-10a452c2131b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 19:53:14.815460 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:14.815439 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-62743-predictor-54cbd885cb-cz89z" Apr 22 19:53:15.103445 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:15.103403 2574 generic.go:358] "Generic (PLEG): container finished" podID="d16a84c8-15e0-40f5-8a21-70ae10ee4b53" containerID="45a34ea550ea4261bb66172488827e4aa2347a31be4b9ab8bf2d0dad3d886592" exitCode=0 Apr 22 19:53:15.103650 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:15.103499 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-62743-predictor-54cbd885cb-cz89z" Apr 22 19:53:15.103650 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:15.103493 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-62743-predictor-54cbd885cb-cz89z" event={"ID":"d16a84c8-15e0-40f5-8a21-70ae10ee4b53","Type":"ContainerDied","Data":"45a34ea550ea4261bb66172488827e4aa2347a31be4b9ab8bf2d0dad3d886592"} Apr 22 19:53:15.103650 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:15.103543 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-62743-predictor-54cbd885cb-cz89z" event={"ID":"d16a84c8-15e0-40f5-8a21-70ae10ee4b53","Type":"ContainerDied","Data":"22855985facf21a802393dda938c0844adf7d38998c2ff99f0001238290c6ef1"} Apr 22 19:53:15.103650 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:15.103563 2574 scope.go:117] "RemoveContainer" containerID="45a34ea550ea4261bb66172488827e4aa2347a31be4b9ab8bf2d0dad3d886592" Apr 22 19:53:15.111828 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:15.111784 2574 scope.go:117] "RemoveContainer" containerID="45a34ea550ea4261bb66172488827e4aa2347a31be4b9ab8bf2d0dad3d886592" Apr 22 19:53:15.112239 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:53:15.112220 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"45a34ea550ea4261bb66172488827e4aa2347a31be4b9ab8bf2d0dad3d886592\": container with ID starting with 45a34ea550ea4261bb66172488827e4aa2347a31be4b9ab8bf2d0dad3d886592 not found: ID does not exist" containerID="45a34ea550ea4261bb66172488827e4aa2347a31be4b9ab8bf2d0dad3d886592" Apr 22 19:53:15.112307 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:15.112246 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45a34ea550ea4261bb66172488827e4aa2347a31be4b9ab8bf2d0dad3d886592"} err="failed to get container status \"45a34ea550ea4261bb66172488827e4aa2347a31be4b9ab8bf2d0dad3d886592\": rpc error: code = NotFound desc = could not find container \"45a34ea550ea4261bb66172488827e4aa2347a31be4b9ab8bf2d0dad3d886592\": container with ID starting with 45a34ea550ea4261bb66172488827e4aa2347a31be4b9ab8bf2d0dad3d886592 not found: ID does not exist" Apr 22 19:53:15.124251 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:15.124223 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-62743-predictor-54cbd885cb-cz89z"] Apr 22 19:53:15.127679 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:15.127657 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-62743-predictor-54cbd885cb-cz89z"] Apr 22 19:53:16.429246 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:16.429213 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d16a84c8-15e0-40f5-8a21-70ae10ee4b53" path="/var/lib/kubelet/pods/d16a84c8-15e0-40f5-8a21-70ae10ee4b53/volumes" Apr 22 19:53:23.095652 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:23.095615 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk" podUID="a2d2887f-0262-4abc-a637-10a452c2131b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 19:53:28.452977 ip-10-0-129-145 
kubenswrapper[2574]: I0422 19:53:28.452944 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-acl-logging/0.log"
Apr 22 19:53:28.458607 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:28.458584 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-acl-logging/0.log"
Apr 22 19:53:33.095426 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:33.095378 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk" podUID="a2d2887f-0262-4abc-a637-10a452c2131b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 22 19:53:43.095251 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:43.095209 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk" podUID="a2d2887f-0262-4abc-a637-10a452c2131b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 22 19:53:53.095045 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:53.095004 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk" podUID="a2d2887f-0262-4abc-a637-10a452c2131b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 22 19:53:57.441317 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:57.441267 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7d5c8-predictor-595fbbfc94-q5ghv"]
Apr 22 19:53:57.441759 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:57.441573 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-7d5c8-predictor-595fbbfc94-q5ghv" podUID="611955a8-9b65-46d1-a592-5a176628ebed" containerName="kserve-container" containerID="cri-o://b1fd22f8ba216638f87ea2a3c01cd9bc0d9618875b20326b27c5359e73355b96" gracePeriod=30
Apr 22 19:53:57.503068 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:57.503031 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4"]
Apr 22 19:53:57.503374 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:57.503362 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d16a84c8-15e0-40f5-8a21-70ae10ee4b53" containerName="kserve-container"
Apr 22 19:53:57.503418 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:57.503375 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d16a84c8-15e0-40f5-8a21-70ae10ee4b53" containerName="kserve-container"
Apr 22 19:53:57.503458 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:57.503423 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d16a84c8-15e0-40f5-8a21-70ae10ee4b53" containerName="kserve-container"
Apr 22 19:53:57.506351 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:57.506331 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4"
Apr 22 19:53:57.513328 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:57.513305 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4"]
Apr 22 19:53:57.517243 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:57.517210 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4"
Apr 22 19:53:57.659306 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:57.659173 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4"]
Apr 22 19:53:57.662086 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:53:57.662054 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37726911_fdc0_4a6e_86e4_4aba23aa46d1.slice/crio-b446184c24aef2f1595b5d0f7d98544cbf857289775f174696426994c38165f7 WatchSource:0}: Error finding container b446184c24aef2f1595b5d0f7d98544cbf857289775f174696426994c38165f7: Status 404 returned error can't find the container with id b446184c24aef2f1595b5d0f7d98544cbf857289775f174696426994c38165f7
Apr 22 19:53:58.242723 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:58.242690 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4" event={"ID":"37726911-fdc0-4a6e-86e4-4aba23aa46d1","Type":"ContainerStarted","Data":"3a20211e65424d7a77f65fc50fc6298058bf2a467a85dc4cc1188cb7e9446745"}
Apr 22 19:53:58.242949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:58.242729 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4" event={"ID":"37726911-fdc0-4a6e-86e4-4aba23aa46d1","Type":"ContainerStarted","Data":"b446184c24aef2f1595b5d0f7d98544cbf857289775f174696426994c38165f7"}
Apr 22 19:53:58.242949 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:58.242860 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4"
Apr 22 19:53:58.244308 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:58.244275 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4" podUID="37726911-fdc0-4a6e-86e4-4aba23aa46d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 22 19:53:58.256961 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:58.256913 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4" podStartSLOduration=1.25690096 podStartE2EDuration="1.25690096s" podCreationTimestamp="2026-04-22 19:53:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:53:58.256253506 +0000 UTC m=+1830.375632830" watchObservedRunningTime="2026-04-22 19:53:58.25690096 +0000 UTC m=+1830.376280269"
Apr 22 19:53:59.246321 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:53:59.246284 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4" podUID="37726911-fdc0-4a6e-86e4-4aba23aa46d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 22 19:54:00.691053 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:00.691027 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7d5c8-predictor-595fbbfc94-q5ghv"
Apr 22 19:54:01.252776 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:01.252742 2574 generic.go:358] "Generic (PLEG): container finished" podID="611955a8-9b65-46d1-a592-5a176628ebed" containerID="b1fd22f8ba216638f87ea2a3c01cd9bc0d9618875b20326b27c5359e73355b96" exitCode=0
Apr 22 19:54:01.252966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:01.252823 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7d5c8-predictor-595fbbfc94-q5ghv"
Apr 22 19:54:01.252966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:01.252822 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7d5c8-predictor-595fbbfc94-q5ghv" event={"ID":"611955a8-9b65-46d1-a592-5a176628ebed","Type":"ContainerDied","Data":"b1fd22f8ba216638f87ea2a3c01cd9bc0d9618875b20326b27c5359e73355b96"}
Apr 22 19:54:01.252966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:01.252930 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7d5c8-predictor-595fbbfc94-q5ghv" event={"ID":"611955a8-9b65-46d1-a592-5a176628ebed","Type":"ContainerDied","Data":"53055eb0bc6a2fcbfe01b812f95471222a46c3606b97286a58fee11909ebb9c7"}
Apr 22 19:54:01.252966 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:01.252948 2574 scope.go:117] "RemoveContainer" containerID="b1fd22f8ba216638f87ea2a3c01cd9bc0d9618875b20326b27c5359e73355b96"
Apr 22 19:54:01.261325 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:01.261306 2574 scope.go:117] "RemoveContainer" containerID="b1fd22f8ba216638f87ea2a3c01cd9bc0d9618875b20326b27c5359e73355b96"
Apr 22 19:54:01.261596 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:54:01.261576 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1fd22f8ba216638f87ea2a3c01cd9bc0d9618875b20326b27c5359e73355b96\": container with ID starting with b1fd22f8ba216638f87ea2a3c01cd9bc0d9618875b20326b27c5359e73355b96 not found: ID does not exist" containerID="b1fd22f8ba216638f87ea2a3c01cd9bc0d9618875b20326b27c5359e73355b96"
Apr 22 19:54:01.261681 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:01.261605 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1fd22f8ba216638f87ea2a3c01cd9bc0d9618875b20326b27c5359e73355b96"} err="failed to get container status \"b1fd22f8ba216638f87ea2a3c01cd9bc0d9618875b20326b27c5359e73355b96\": rpc error: code = NotFound desc = could not find container \"b1fd22f8ba216638f87ea2a3c01cd9bc0d9618875b20326b27c5359e73355b96\": container with ID starting with b1fd22f8ba216638f87ea2a3c01cd9bc0d9618875b20326b27c5359e73355b96 not found: ID does not exist"
Apr 22 19:54:01.273600 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:01.273575 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7d5c8-predictor-595fbbfc94-q5ghv"]
Apr 22 19:54:01.277281 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:01.277255 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7d5c8-predictor-595fbbfc94-q5ghv"]
Apr 22 19:54:02.429553 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:02.429514 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="611955a8-9b65-46d1-a592-5a176628ebed" path="/var/lib/kubelet/pods/611955a8-9b65-46d1-a592-5a176628ebed/volumes"
Apr 22 19:54:03.095990 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:03.095957 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk"
Apr 22 19:54:09.246856 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:09.246794 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4" podUID="37726911-fdc0-4a6e-86e4-4aba23aa46d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 22 19:54:19.246723 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:19.246680 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4" podUID="37726911-fdc0-4a6e-86e4-4aba23aa46d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 22 19:54:29.247273 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:29.247223 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4" podUID="37726911-fdc0-4a6e-86e4-4aba23aa46d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 22 19:54:31.730564 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:31.730533 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk"]
Apr 22 19:54:31.731020 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:31.730787 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk" podUID="a2d2887f-0262-4abc-a637-10a452c2131b" containerName="kserve-container" containerID="cri-o://6849d3dd6aab5d92e7e774da24592b68c06dd9ebd7412b243bcac0f263328225" gracePeriod=30
Apr 22 19:54:31.861166 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:31.861123 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0c611-predictor-54f6b9dc8b-hmvmh"]
Apr 22 19:54:31.861617 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:31.861603 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="611955a8-9b65-46d1-a592-5a176628ebed" containerName="kserve-container"
Apr 22 19:54:31.861671 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:31.861621 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="611955a8-9b65-46d1-a592-5a176628ebed" containerName="kserve-container"
Apr 22 19:54:31.861707 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:31.861689 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="611955a8-9b65-46d1-a592-5a176628ebed" containerName="kserve-container"
Apr 22 19:54:31.866278 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:31.866254 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0c611-predictor-54f6b9dc8b-hmvmh"
Apr 22 19:54:31.872380 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:31.872316 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0c611-predictor-54f6b9dc8b-hmvmh"]
Apr 22 19:54:31.880726 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:31.880705 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0c611-predictor-54f6b9dc8b-hmvmh"
Apr 22 19:54:32.022301 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:32.022260 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0c611-predictor-54f6b9dc8b-hmvmh"]
Apr 22 19:54:32.025565 ip-10-0-129-145 kubenswrapper[2574]: W0422 19:54:32.025536 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ca81ae3_fece_4202_8a80_3da8ca59a7e1.slice/crio-1b3db5eee51b177566da9068a2010b4d92930b8d1ba08b4f3d06fd8bc6040713 WatchSource:0}: Error finding container 1b3db5eee51b177566da9068a2010b4d92930b8d1ba08b4f3d06fd8bc6040713: Status 404 returned error can't find the container with id 1b3db5eee51b177566da9068a2010b4d92930b8d1ba08b4f3d06fd8bc6040713
Apr 22 19:54:32.352271 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:32.352175 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0c611-predictor-54f6b9dc8b-hmvmh" event={"ID":"5ca81ae3-fece-4202-8a80-3da8ca59a7e1","Type":"ContainerStarted","Data":"dc39f3370129889a6d2395042381437d0c9bc9209567dca9f8e9c612cc89b543"}
Apr 22 19:54:32.352271 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:32.352217 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0c611-predictor-54f6b9dc8b-hmvmh" event={"ID":"5ca81ae3-fece-4202-8a80-3da8ca59a7e1","Type":"ContainerStarted","Data":"1b3db5eee51b177566da9068a2010b4d92930b8d1ba08b4f3d06fd8bc6040713"}
Apr 22 19:54:32.352487 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:32.352318 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-0c611-predictor-54f6b9dc8b-hmvmh"
Apr 22 19:54:32.353701 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:32.353675 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0c611-predictor-54f6b9dc8b-hmvmh" podUID="5ca81ae3-fece-4202-8a80-3da8ca59a7e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 22 19:54:32.366543 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:32.366495 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-0c611-predictor-54f6b9dc8b-hmvmh" podStartSLOduration=1.366480067 podStartE2EDuration="1.366480067s" podCreationTimestamp="2026-04-22 19:54:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:54:32.365933773 +0000 UTC m=+1864.485313081" watchObservedRunningTime="2026-04-22 19:54:32.366480067 +0000 UTC m=+1864.485859376"
Apr 22 19:54:33.095398 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:33.095348 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk" podUID="a2d2887f-0262-4abc-a637-10a452c2131b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 22 19:54:33.356270 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:33.356179 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0c611-predictor-54f6b9dc8b-hmvmh" podUID="5ca81ae3-fece-4202-8a80-3da8ca59a7e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 22 19:54:34.969929 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:34.969906 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk"
Apr 22 19:54:35.364249 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:35.364213 2574 generic.go:358] "Generic (PLEG): container finished" podID="a2d2887f-0262-4abc-a637-10a452c2131b" containerID="6849d3dd6aab5d92e7e774da24592b68c06dd9ebd7412b243bcac0f263328225" exitCode=0
Apr 22 19:54:35.364415 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:35.364268 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk"
Apr 22 19:54:35.364415 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:35.364285 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk" event={"ID":"a2d2887f-0262-4abc-a637-10a452c2131b","Type":"ContainerDied","Data":"6849d3dd6aab5d92e7e774da24592b68c06dd9ebd7412b243bcac0f263328225"}
Apr 22 19:54:35.364415 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:35.364310 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk" event={"ID":"a2d2887f-0262-4abc-a637-10a452c2131b","Type":"ContainerDied","Data":"266c4975ea1776aea5d22499942a1bd6bb0ba7e347153e16438e743ca6e5b683"}
Apr 22 19:54:35.364415 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:35.364325 2574 scope.go:117] "RemoveContainer" containerID="6849d3dd6aab5d92e7e774da24592b68c06dd9ebd7412b243bcac0f263328225"
Apr 22 19:54:35.372266 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:35.372246 2574 scope.go:117] "RemoveContainer" containerID="6849d3dd6aab5d92e7e774da24592b68c06dd9ebd7412b243bcac0f263328225"
Apr 22 19:54:35.372510 ip-10-0-129-145 kubenswrapper[2574]: E0422 19:54:35.372492 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6849d3dd6aab5d92e7e774da24592b68c06dd9ebd7412b243bcac0f263328225\": container with ID starting with 6849d3dd6aab5d92e7e774da24592b68c06dd9ebd7412b243bcac0f263328225 not found: ID does not exist" containerID="6849d3dd6aab5d92e7e774da24592b68c06dd9ebd7412b243bcac0f263328225"
Apr 22 19:54:35.372544 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:35.372520 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6849d3dd6aab5d92e7e774da24592b68c06dd9ebd7412b243bcac0f263328225"} err="failed to get container status \"6849d3dd6aab5d92e7e774da24592b68c06dd9ebd7412b243bcac0f263328225\": rpc error: code = NotFound desc = could not find container \"6849d3dd6aab5d92e7e774da24592b68c06dd9ebd7412b243bcac0f263328225\": container with ID starting with 6849d3dd6aab5d92e7e774da24592b68c06dd9ebd7412b243bcac0f263328225 not found: ID does not exist"
Apr 22 19:54:35.384441 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:35.384415 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk"]
Apr 22 19:54:35.387540 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:35.387515 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-44fda-predictor-68df868f8b-md6rk"]
Apr 22 19:54:36.429187 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:36.429155 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2d2887f-0262-4abc-a637-10a452c2131b" path="/var/lib/kubelet/pods/a2d2887f-0262-4abc-a637-10a452c2131b/volumes"
Apr 22 19:54:39.246856 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:39.246787 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4" podUID="37726911-fdc0-4a6e-86e4-4aba23aa46d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 22 19:54:43.356776 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:43.356735 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0c611-predictor-54f6b9dc8b-hmvmh" podUID="5ca81ae3-fece-4202-8a80-3da8ca59a7e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 22 19:54:49.247136 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:49.247057 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4"
Apr 22 19:54:53.356876 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:54:53.356797 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0c611-predictor-54f6b9dc8b-hmvmh" podUID="5ca81ae3-fece-4202-8a80-3da8ca59a7e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 22 19:55:03.357069 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:55:03.357023 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0c611-predictor-54f6b9dc8b-hmvmh" podUID="5ca81ae3-fece-4202-8a80-3da8ca59a7e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 22 19:55:13.357031 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:55:13.356987 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0c611-predictor-54f6b9dc8b-hmvmh" podUID="5ca81ae3-fece-4202-8a80-3da8ca59a7e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 22 19:55:23.357904 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:55:23.357868 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-0c611-predictor-54f6b9dc8b-hmvmh"
Apr 22 19:58:28.479024 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:58:28.478995 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-acl-logging/0.log"
Apr 22 19:58:28.484951 ip-10-0-129-145 kubenswrapper[2574]: I0422 19:58:28.484929 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-acl-logging/0.log"
Apr 22 20:03:28.499249 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:03:28.499224 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-acl-logging/0.log"
Apr 22 20:03:28.505211 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:03:28.505189 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-acl-logging/0.log"
Apr 22 20:03:56.761489 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:03:56.761407 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0c611-predictor-54f6b9dc8b-hmvmh"]
Apr 22 20:03:56.761973 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:03:56.761630 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-0c611-predictor-54f6b9dc8b-hmvmh" podUID="5ca81ae3-fece-4202-8a80-3da8ca59a7e1" containerName="kserve-container" containerID="cri-o://dc39f3370129889a6d2395042381437d0c9bc9209567dca9f8e9c612cc89b543" gracePeriod=30
Apr 22 20:04:00.009243 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:04:00.009217 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0c611-predictor-54f6b9dc8b-hmvmh"
Apr 22 20:04:00.165548 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:04:00.165455 2574 generic.go:358] "Generic (PLEG): container finished" podID="5ca81ae3-fece-4202-8a80-3da8ca59a7e1" containerID="dc39f3370129889a6d2395042381437d0c9bc9209567dca9f8e9c612cc89b543" exitCode=0
Apr 22 20:04:00.165548 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:04:00.165506 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0c611-predictor-54f6b9dc8b-hmvmh" event={"ID":"5ca81ae3-fece-4202-8a80-3da8ca59a7e1","Type":"ContainerDied","Data":"dc39f3370129889a6d2395042381437d0c9bc9209567dca9f8e9c612cc89b543"}
Apr 22 20:04:00.165548 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:04:00.165525 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0c611-predictor-54f6b9dc8b-hmvmh"
Apr 22 20:04:00.165548 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:04:00.165541 2574 scope.go:117] "RemoveContainer" containerID="dc39f3370129889a6d2395042381437d0c9bc9209567dca9f8e9c612cc89b543"
Apr 22 20:04:00.165880 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:04:00.165531 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0c611-predictor-54f6b9dc8b-hmvmh" event={"ID":"5ca81ae3-fece-4202-8a80-3da8ca59a7e1","Type":"ContainerDied","Data":"1b3db5eee51b177566da9068a2010b4d92930b8d1ba08b4f3d06fd8bc6040713"}
Apr 22 20:04:00.174166 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:04:00.174051 2574 scope.go:117] "RemoveContainer" containerID="dc39f3370129889a6d2395042381437d0c9bc9209567dca9f8e9c612cc89b543"
Apr 22 20:04:00.174369 ip-10-0-129-145 kubenswrapper[2574]: E0422 20:04:00.174349 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc39f3370129889a6d2395042381437d0c9bc9209567dca9f8e9c612cc89b543\": container with ID starting with dc39f3370129889a6d2395042381437d0c9bc9209567dca9f8e9c612cc89b543 not found: ID does not exist" containerID="dc39f3370129889a6d2395042381437d0c9bc9209567dca9f8e9c612cc89b543"
Apr 22 20:04:00.174428 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:04:00.174380 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc39f3370129889a6d2395042381437d0c9bc9209567dca9f8e9c612cc89b543"} err="failed to get container status \"dc39f3370129889a6d2395042381437d0c9bc9209567dca9f8e9c612cc89b543\": rpc error: code = NotFound desc = could not find container \"dc39f3370129889a6d2395042381437d0c9bc9209567dca9f8e9c612cc89b543\": container with ID starting with dc39f3370129889a6d2395042381437d0c9bc9209567dca9f8e9c612cc89b543 not found: ID does not exist"
Apr 22 20:04:00.188160 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:04:00.188134 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0c611-predictor-54f6b9dc8b-hmvmh"]
Apr 22 20:04:00.191737 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:04:00.191714 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0c611-predictor-54f6b9dc8b-hmvmh"]
Apr 22 20:04:00.429662 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:04:00.429590 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ca81ae3-fece-4202-8a80-3da8ca59a7e1" path="/var/lib/kubelet/pods/5ca81ae3-fece-4202-8a80-3da8ca59a7e1/volumes"
Apr 22 20:08:28.521149 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:08:28.521034 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-acl-logging/0.log"
Apr 22 20:08:28.527257 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:08:28.527237 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-acl-logging/0.log"
Apr 22 20:11:26.946030 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:11:26.945948 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4"]
Apr 22 20:11:26.946603 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:11:26.946244 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4" podUID="37726911-fdc0-4a6e-86e4-4aba23aa46d1" containerName="kserve-container" containerID="cri-o://3a20211e65424d7a77f65fc50fc6298058bf2a467a85dc4cc1188cb7e9446745" gracePeriod=30
Apr 22 20:11:29.246518 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:11:29.246475 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4" podUID="37726911-fdc0-4a6e-86e4-4aba23aa46d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 22 20:11:29.979075 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:11:29.979053 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4"
Apr 22 20:11:30.553020 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:11:30.552987 2574 generic.go:358] "Generic (PLEG): container finished" podID="37726911-fdc0-4a6e-86e4-4aba23aa46d1" containerID="3a20211e65424d7a77f65fc50fc6298058bf2a467a85dc4cc1188cb7e9446745" exitCode=0
Apr 22 20:11:30.553470 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:11:30.553047 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4"
Apr 22 20:11:30.553470 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:11:30.553052 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4" event={"ID":"37726911-fdc0-4a6e-86e4-4aba23aa46d1","Type":"ContainerDied","Data":"3a20211e65424d7a77f65fc50fc6298058bf2a467a85dc4cc1188cb7e9446745"}
Apr 22 20:11:30.553470 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:11:30.553080 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4" event={"ID":"37726911-fdc0-4a6e-86e4-4aba23aa46d1","Type":"ContainerDied","Data":"b446184c24aef2f1595b5d0f7d98544cbf857289775f174696426994c38165f7"}
Apr 22 20:11:30.553470 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:11:30.553096 2574 scope.go:117] "RemoveContainer" containerID="3a20211e65424d7a77f65fc50fc6298058bf2a467a85dc4cc1188cb7e9446745"
Apr 22 20:11:30.560975 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:11:30.560943 2574 scope.go:117] "RemoveContainer" containerID="3a20211e65424d7a77f65fc50fc6298058bf2a467a85dc4cc1188cb7e9446745"
Apr 22 20:11:30.561203 ip-10-0-129-145 kubenswrapper[2574]: E0422 20:11:30.561182 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a20211e65424d7a77f65fc50fc6298058bf2a467a85dc4cc1188cb7e9446745\": container with ID starting with 3a20211e65424d7a77f65fc50fc6298058bf2a467a85dc4cc1188cb7e9446745 not found: ID does not exist" containerID="3a20211e65424d7a77f65fc50fc6298058bf2a467a85dc4cc1188cb7e9446745"
Apr 22 20:11:30.561254 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:11:30.561213 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a20211e65424d7a77f65fc50fc6298058bf2a467a85dc4cc1188cb7e9446745"} err="failed to get container status \"3a20211e65424d7a77f65fc50fc6298058bf2a467a85dc4cc1188cb7e9446745\": rpc error: code = NotFound desc = could not find container \"3a20211e65424d7a77f65fc50fc6298058bf2a467a85dc4cc1188cb7e9446745\": container with ID starting with 3a20211e65424d7a77f65fc50fc6298058bf2a467a85dc4cc1188cb7e9446745 not found: ID does not exist"
Apr 22 20:11:30.569134 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:11:30.569110 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4"]
Apr 22 20:11:30.572396 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:11:30.572365 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-65828-predictor-7fc45d4b74-sspp4"]
Apr 22 20:11:32.429404 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:11:32.429358 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37726911-fdc0-4a6e-86e4-4aba23aa46d1" path="/var/lib/kubelet/pods/37726911-fdc0-4a6e-86e4-4aba23aa46d1/volumes"
Apr 22 20:11:54.873838 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:11:54.873786 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-k48jx_793d5ba1-c977-4404-bd38-8b78e8e5e191/global-pull-secret-syncer/0.log"
Apr 22 20:11:54.993634 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:11:54.993603 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-42mdb_76e397fd-d6b3-4cfb-aa90-fb57dfa68ba4/konnectivity-agent/0.log"
Apr 22 20:11:55.069715 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:11:55.069683 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-145.ec2.internal_b71282cca731aa5ddeb9357344f86ebf/haproxy/0.log"
Apr 22 20:11:58.806311 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:11:58.806276 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-q6jjs_5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d/node-exporter/0.log"
Apr 22 20:11:58.822427 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:11:58.822401 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-q6jjs_5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d/kube-rbac-proxy/0.log"
Apr 22 20:11:58.840772 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:11:58.840748 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-q6jjs_5c86ce86-5a72-4a8d-8e4f-42bc351d2b4d/init-textfile/0.log"
Apr 22 20:11:59.146980 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:11:59.146900 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-kkgq2_0916d097-426d-4179-a5d0-b4cbdbeb9c21/prometheus-operator-admission-webhook/0.log"
Apr 22 20:12:01.295985 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:01.295955 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-chhcp_9656e049-2948-47bd-aec9-0bf4e3612f24/download-server/0.log"
Apr 22 20:12:02.120688 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.120655 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws"]
Apr 22 20:12:02.120977 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.120964 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="37726911-fdc0-4a6e-86e4-4aba23aa46d1" containerName="kserve-container"
Apr 22 20:12:02.121035 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.120978 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="37726911-fdc0-4a6e-86e4-4aba23aa46d1" containerName="kserve-container"
Apr 22 20:12:02.121035 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.120992 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ca81ae3-fece-4202-8a80-3da8ca59a7e1" containerName="kserve-container"
Apr 22 20:12:02.121035 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.120998 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ca81ae3-fece-4202-8a80-3da8ca59a7e1" containerName="kserve-container"
Apr 22 20:12:02.121035 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.121014 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2d2887f-0262-4abc-a637-10a452c2131b" containerName="kserve-container"
Apr 22 20:12:02.121035 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.121020 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d2887f-0262-4abc-a637-10a452c2131b" containerName="kserve-container"
Apr 22 20:12:02.121184 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.121065 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="a2d2887f-0262-4abc-a637-10a452c2131b" containerName="kserve-container"
Apr 22 20:12:02.121184 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.121074 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ca81ae3-fece-4202-8a80-3da8ca59a7e1" containerName="kserve-container"
Apr 22 20:12:02.121184 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.121081 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="37726911-fdc0-4a6e-86e4-4aba23aa46d1" containerName="kserve-container"
Apr 22 20:12:02.123920 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.123904 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws"
Apr 22 20:12:02.126498 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.126477 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-94jtb\"/\"openshift-service-ca.crt\""
Apr 22 20:12:02.126609 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.126477 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-94jtb\"/\"default-dockercfg-btnhf\""
Apr 22 20:12:02.127561 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.127548 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-94jtb\"/\"kube-root-ca.crt\""
Apr 22 20:12:02.133463 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.133442 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws"]
Apr 22 20:12:02.210044 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.210006 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0a331a65-c4aa-4b2e-bc87-738031fc323c-podres\") pod \"perf-node-gather-daemonset-gmkws\" (UID: \"0a331a65-c4aa-4b2e-bc87-738031fc323c\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws"
Apr 22 20:12:02.210044 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.210042 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0a331a65-c4aa-4b2e-bc87-738031fc323c-lib-modules\") pod \"perf-node-gather-daemonset-gmkws\" (UID: \"0a331a65-c4aa-4b2e-bc87-738031fc323c\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws"
Apr 22 20:12:02.210246 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.210115 2574 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0a331a65-c4aa-4b2e-bc87-738031fc323c-sys\") pod \"perf-node-gather-daemonset-gmkws\" (UID: \"0a331a65-c4aa-4b2e-bc87-738031fc323c\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws" Apr 22 20:12:02.210246 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.210146 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0a331a65-c4aa-4b2e-bc87-738031fc323c-proc\") pod \"perf-node-gather-daemonset-gmkws\" (UID: \"0a331a65-c4aa-4b2e-bc87-738031fc323c\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws" Apr 22 20:12:02.210246 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.210189 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74lc5\" (UniqueName: \"kubernetes.io/projected/0a331a65-c4aa-4b2e-bc87-738031fc323c-kube-api-access-74lc5\") pod \"perf-node-gather-daemonset-gmkws\" (UID: \"0a331a65-c4aa-4b2e-bc87-738031fc323c\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws" Apr 22 20:12:02.311346 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.311307 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0a331a65-c4aa-4b2e-bc87-738031fc323c-lib-modules\") pod \"perf-node-gather-daemonset-gmkws\" (UID: \"0a331a65-c4aa-4b2e-bc87-738031fc323c\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws" Apr 22 20:12:02.311770 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.311378 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0a331a65-c4aa-4b2e-bc87-738031fc323c-sys\") pod \"perf-node-gather-daemonset-gmkws\" (UID: \"0a331a65-c4aa-4b2e-bc87-738031fc323c\") " 
pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws" Apr 22 20:12:02.311770 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.311406 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0a331a65-c4aa-4b2e-bc87-738031fc323c-proc\") pod \"perf-node-gather-daemonset-gmkws\" (UID: \"0a331a65-c4aa-4b2e-bc87-738031fc323c\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws" Apr 22 20:12:02.311770 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.311454 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74lc5\" (UniqueName: \"kubernetes.io/projected/0a331a65-c4aa-4b2e-bc87-738031fc323c-kube-api-access-74lc5\") pod \"perf-node-gather-daemonset-gmkws\" (UID: \"0a331a65-c4aa-4b2e-bc87-738031fc323c\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws" Apr 22 20:12:02.311770 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.311462 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0a331a65-c4aa-4b2e-bc87-738031fc323c-sys\") pod \"perf-node-gather-daemonset-gmkws\" (UID: \"0a331a65-c4aa-4b2e-bc87-738031fc323c\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws" Apr 22 20:12:02.311770 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.311509 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0a331a65-c4aa-4b2e-bc87-738031fc323c-lib-modules\") pod \"perf-node-gather-daemonset-gmkws\" (UID: \"0a331a65-c4aa-4b2e-bc87-738031fc323c\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws" Apr 22 20:12:02.311770 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.311518 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/0a331a65-c4aa-4b2e-bc87-738031fc323c-podres\") pod \"perf-node-gather-daemonset-gmkws\" (UID: \"0a331a65-c4aa-4b2e-bc87-738031fc323c\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws" Apr 22 20:12:02.311770 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.311542 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0a331a65-c4aa-4b2e-bc87-738031fc323c-proc\") pod \"perf-node-gather-daemonset-gmkws\" (UID: \"0a331a65-c4aa-4b2e-bc87-738031fc323c\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws" Apr 22 20:12:02.311770 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.311590 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0a331a65-c4aa-4b2e-bc87-738031fc323c-podres\") pod \"perf-node-gather-daemonset-gmkws\" (UID: \"0a331a65-c4aa-4b2e-bc87-738031fc323c\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws" Apr 22 20:12:02.319821 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.319786 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74lc5\" (UniqueName: \"kubernetes.io/projected/0a331a65-c4aa-4b2e-bc87-738031fc323c-kube-api-access-74lc5\") pod \"perf-node-gather-daemonset-gmkws\" (UID: \"0a331a65-c4aa-4b2e-bc87-738031fc323c\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws" Apr 22 20:12:02.331062 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.331032 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jb2x5_e544a780-e5f6-409b-b57c-0e80f0766bb1/dns/0.log" Apr 22 20:12:02.349882 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.349852 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jb2x5_e544a780-e5f6-409b-b57c-0e80f0766bb1/kube-rbac-proxy/0.log" Apr 22 20:12:02.405388 ip-10-0-129-145 
kubenswrapper[2574]: I0422 20:12:02.405314 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cpzbb_b1c0b054-80a0-4cc5-b053-a4d99268aa8f/dns-node-resolver/0.log" Apr 22 20:12:02.433761 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.433729 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws" Apr 22 20:12:02.553755 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.553722 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws"] Apr 22 20:12:02.556926 ip-10-0-129-145 kubenswrapper[2574]: W0422 20:12:02.556900 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0a331a65_c4aa_4b2e_bc87_738031fc323c.slice/crio-88ef8bedbddb245875fd6be6d84149377baf6cc4c8de94fe034a122842030439 WatchSource:0}: Error finding container 88ef8bedbddb245875fd6be6d84149377baf6cc4c8de94fe034a122842030439: Status 404 returned error can't find the container with id 88ef8bedbddb245875fd6be6d84149377baf6cc4c8de94fe034a122842030439 Apr 22 20:12:02.558633 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.558610 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:12:02.656129 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.656034 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws" event={"ID":"0a331a65-c4aa-4b2e-bc87-738031fc323c","Type":"ContainerStarted","Data":"aa64e94a257fd34a64f2d31300f79febb5fee606524e2445cd366dac7bc82b12"} Apr 22 20:12:02.656129 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.656080 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws" 
event={"ID":"0a331a65-c4aa-4b2e-bc87-738031fc323c","Type":"ContainerStarted","Data":"88ef8bedbddb245875fd6be6d84149377baf6cc4c8de94fe034a122842030439"} Apr 22 20:12:02.656129 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.656111 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws" Apr 22 20:12:02.672486 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.672433 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws" podStartSLOduration=0.672417281 podStartE2EDuration="672.417281ms" podCreationTimestamp="2026-04-22 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:12:02.671114343 +0000 UTC m=+2914.790493653" watchObservedRunningTime="2026-04-22 20:12:02.672417281 +0000 UTC m=+2914.791796590" Apr 22 20:12:02.803739 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.803695 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-694bf65ddb-9n9qb_1cc7575c-4a73-4478-a11a-0933bcca8694/registry/0.log" Apr 22 20:12:02.822132 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:02.822103 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jkcf2_91147599-bdf0-49f5-98ed-a3567eaf56db/node-ca/0.log" Apr 22 20:12:03.817107 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:03.817079 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-kn8hz_33fcaecf-093b-4a6b-9bed-0310951d4825/serve-healthcheck-canary/0.log" Apr 22 20:12:04.144487 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:04.144394 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-9t2qh_80b67f15-b534-4bb5-98c8-6566228be090/kube-rbac-proxy/0.log" Apr 22 20:12:04.160589 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:04.160566 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9t2qh_80b67f15-b534-4bb5-98c8-6566228be090/exporter/0.log" Apr 22 20:12:04.177780 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:04.177751 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9t2qh_80b67f15-b534-4bb5-98c8-6566228be090/extractor/0.log" Apr 22 20:12:06.193978 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:06.193946 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-545d8995fb-w2j7l_c1f8dc15-45b3-4be9-baf3-9d624a993e5f/manager/0.log" Apr 22 20:12:06.210818 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:06.210777 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-vgt72_d3a161b2-9b3a-40d3-8e67-dd6f929f7713/manager/0.log" Apr 22 20:12:06.667341 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:06.667312 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-pzrzc_6708deef-56e8-47d4-a361-88893b47d60c/manager/0.log" Apr 22 20:12:06.708594 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:06.708563 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-hmzs7_a82ba81b-2e61-49b3-b439-16a1541e4352/seaweedfs/0.log" Apr 22 20:12:08.668736 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:08.668708 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-gmkws" Apr 22 20:12:09.967188 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:09.967154 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-d2sdz_80f993c4-91be-4779-90b0-4d41d6f29f4e/migrator/0.log" Apr 22 20:12:10.002093 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:10.002066 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-d2sdz_80f993c4-91be-4779-90b0-4d41d6f29f4e/graceful-termination/0.log" Apr 22 20:12:11.232918 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:11.232821 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8z9zt_32dd8967-d559-441e-95e3-6faf8bc49253/kube-multus/0.log" Apr 22 20:12:11.395529 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:11.395487 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dcctc_b948da6e-8c3e-4892-92f1-4f59d7c5c885/kube-multus-additional-cni-plugins/0.log" Apr 22 20:12:11.417249 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:11.417211 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dcctc_b948da6e-8c3e-4892-92f1-4f59d7c5c885/egress-router-binary-copy/0.log" Apr 22 20:12:11.441882 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:11.441859 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dcctc_b948da6e-8c3e-4892-92f1-4f59d7c5c885/cni-plugins/0.log" Apr 22 20:12:11.461064 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:11.461038 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dcctc_b948da6e-8c3e-4892-92f1-4f59d7c5c885/bond-cni-plugin/0.log" Apr 22 20:12:11.478776 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:11.478751 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dcctc_b948da6e-8c3e-4892-92f1-4f59d7c5c885/routeoverride-cni/0.log" Apr 22 20:12:11.496693 
ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:11.496674 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dcctc_b948da6e-8c3e-4892-92f1-4f59d7c5c885/whereabouts-cni-bincopy/0.log" Apr 22 20:12:11.512650 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:11.512630 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dcctc_b948da6e-8c3e-4892-92f1-4f59d7c5c885/whereabouts-cni/0.log" Apr 22 20:12:11.785569 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:11.785486 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nndbq_1c461896-346c-4de1-9362-b9f83bd3486d/network-metrics-daemon/0.log" Apr 22 20:12:11.804534 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:11.804501 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nndbq_1c461896-346c-4de1-9362-b9f83bd3486d/kube-rbac-proxy/0.log" Apr 22 20:12:12.668026 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:12.667995 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-controller/0.log" Apr 22 20:12:12.681136 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:12.681107 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-acl-logging/0.log" Apr 22 20:12:12.706697 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:12.706667 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovn-acl-logging/1.log" Apr 22 20:12:12.723566 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:12.723538 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/kube-rbac-proxy-node/0.log" Apr 22 20:12:12.739864 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:12.739832 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 20:12:12.754352 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:12.754318 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/northd/0.log" Apr 22 20:12:12.769506 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:12.769482 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/nbdb/0.log" Apr 22 20:12:12.787832 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:12.787783 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/sbdb/0.log" Apr 22 20:12:12.948834 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:12.948741 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rsnsl_5879d8e5-623a-4ec2-9a22-b0b6c0c5917b/ovnkube-controller/0.log" Apr 22 20:12:14.069352 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:14.069321 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-bllx4_cbfd8869-819e-45c7-9536-08c72a48f2c3/network-check-target-container/0.log" Apr 22 20:12:14.866620 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:14.866585 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-85tpl_7dbaab45-2adf-4e5c-b969-f8d3eb83ea37/iptables-alerter/0.log" Apr 22 20:12:15.412147 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:15.412111 2574 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-7kfks_0fdac791-5aa3-4153-bb07-34cec3dbf296/tuned/0.log" Apr 22 20:12:18.095460 ip-10-0-129-145 kubenswrapper[2574]: I0422 20:12:18.095429 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-865cb79987-c5cb4_4425d89c-2d75-40bb-90fd-74877683a094/service-ca-controller/0.log"